WeTest 2017

Sometime last month, James and I went to WeTest 2017 in Auckland. As James has been doing so much writing lately, I took a crack at writing up how I found the day!

Slides and photos from the event can be found at

This year WeTest was bigger than ever, moving into the ANZ Viaduct Events Centre with ~250 people attending, including most of our QA team. It was my first experience of WeTest and the WeTest community, so I was excited and expecting a great day, and it didn't disappoint. I made sure I was there a bit early and was stoked to bump into a couple of my old colleagues, Franklin & John.
The venue had a really nice setup with groups of ~8 around tables; everyone had a great view of the speakers and the two large screens. This year's theme, 'What now, what next', brought together New Zealand and international star speakers to discuss where they are at and where the industry is heading.

Bonus points for spotting both the Super Testing Bros in the crowd.

Keynote - Dan Billing @thetestdoctor

How to be a Redshirt, and survive (or managing bias and negativity when security testing)
Dan Billing, over from the UK, kicked us off with a really insightful talk about life as a security tester. Themed around old-school Star Trek, Dan's talk centred on the role of a security officer, or 'red shirt': the first people to reach the surface of a new planet, putting their lives on the line and dying gruesome deaths to protect the crew. Without dying quite as often, Dan recounted some of the challenges that come with being responsible for a project's security, and the mental obstacles that need to be overcome.

Key Takeaways
* Diverse teams - may need longer to complete testing across a piece of code, but will surface a broader range of issues, in turn improving your security.
* Modelling - use flow charts, mind maps and threat diagrams to model your application (courtesy of @eviltester).
* Observation - observe how your application behaves when used by different groups, e.g. experienced, inexperienced and malicious users.
* Reflection - step back and reflect before deciding the best way forward.
* Interrogation - dive deep into selected risky scenarios: what happens if…?
* Manipulation - manipulate data and use scripts to probe how the application responds.
OWASP Juice Shop
This is a fantastic, fun, gamified application for learning how attacks occur, through both technical means and social engineering.

Ardian Silvandianto

Testing as a team: the sequel


Ardian shared his story of how he transformed his testing approach and that of his team. The two QAs were regularly suffering burnout dealing with the high in-tray of tickets from the eight developers. With the other QA going on leave for some time, Ardian used this as a chance to try something new. Through his presentation we saw the planning, application and development of his team's new working method, evolving from a traditional sprint/scrum methodology and shifting left to have QA work much more closely with the developers, much like we do here at Pushpay.
It's great to see the same ideas succeeding elsewhere. Ardian is still very early in his career, and I look forward to hearing more from him in the future.

Sunil Kumar @sunilkumar56

Mobile App Testing for Internet of Things and Wearables

I first met Sunil in Sydney last year at CASTx17. He wasn't speaking then, but it was great to see him up on stage sharing his experiences exploring IoT and the different challenges in testing this vertical. With many objects not having interfaces, testing is very different and requires keen observation and many new techniques. Sunil shared experiences of past endeavours, the tools he used and the importance of security in this new domain. In one anecdote about a seemingly harmless, though embarrassing, leak, the voices of two million parents recorded by a talking bear were taken in an attack; stark consequences followed when it was discovered these recordings could be used to bypass voice-recognition security on bank accounts.

Daniel McClelland @DeeMickSee

Paint It Grey: Using debugging proxies to move beyond black box testing

Dan, technical lead at Trade Me, treated us to an engaging talk on the concepts and benefits of using debugging proxies such as Fiddler or Charles. This is something that I do on a day-to-day basis, and while there was little new for me personally, there was plenty for the aspiring tester to take on board. Dan demonstrated how simple and beneficial it is to add this to your testing repertoire.
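If you haven't used one before, the core idea is simple: a debugging proxy sits between client and server, records every request and response, and can rewrite either in flight. Here is a minimal self-contained sketch of that idea in Python's standard library - all names and behaviour are invented for illustration; real tools like Fiddler and Charles do this transparently for whole devices:

```python
# Toy debugging proxy: forwards GET requests to an "upstream" server,
# records the traffic it sees, and tampers with the response on the way back.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

captured = []  # (path, status, body) tuples the proxy has observed


class UpstreamHandler(BaseHTTPRequestHandler):
    """Stand-in for the real server under test."""

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello from upstream")

    def log_message(self, *args):  # silence default stderr logging
        pass


class DebugProxyHandler(BaseHTTPRequestHandler):
    """Forwards to upstream, recording (and modifying) traffic en route."""

    upstream = None  # set to "http://host:port" before serving

    def do_GET(self):
        with urllib.request.urlopen(self.upstream + self.path) as resp:
            body = resp.read()
        captured.append((self.path, resp.status, body))  # log what we saw
        self.send_response(resp.status)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        # A debugging proxy can also *rewrite* the response before it
        # reaches the client - here we tag it so the tampering is visible.
        self.wfile.write(body + b" (via proxy)")

    def log_message(self, *args):
        pass


def serve(server):
    threading.Thread(target=server.serve_forever, daemon=True).start()


upstream = HTTPServer(("127.0.0.1", 0), UpstreamHandler)  # port 0 = any free port
DebugProxyHandler.upstream = f"http://127.0.0.1:{upstream.server_port}"
proxy = HTTPServer(("127.0.0.1", 0), DebugProxyHandler)
serve(upstream)
serve(proxy)

# The "client" talks to the proxy, which relays to the upstream server.
with urllib.request.urlopen(f"http://127.0.0.1:{proxy.server_port}/status") as r:
    body = r.read().decode()
print(body)
print(captured)
```

The value in real testing comes from that middle layer: you can inspect headers you'd never see in the browser, replay requests, or inject faults and malformed payloads to see how the client copes.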
Dan is also a talented musician and artist; I looked up some of his stuff and liked it, so I'll give him a little plug for that too. You can hear a review of his debut album here:


Emma Barnes

Management, ethics and inclusion in testing

Emma gave us an open and honest talk about a topic that is relevant to any industry. I'm super proud to work in an industry that I feel is more inclusive and ethical than many others, but we should never rest on our laurels. Just because we're comparatively good at this does not mean it's solved, and talks like this help keep us aware and motivated to provide safe and inclusive work environments for all.
The big takeaway I got from this talk was to accept the inevitable: you and I will screw up, and there will be times we get it wrong and hurt people. We know that's not our intention, but how we apologise, recover and grow from these mistakes is key. As a tall white English male, I'm aware that I experience the world from a privileged perspective and am very wary of hurting people. I'm not going to be any less self-aware about this, but I can make peace with myself over mistakes now that I know how to apologise properly.
* Just say 'I'm sorry' and thank them for the feedback.
* Make a plan to change your actions and talk about it with the person involved.
* Change your actions. (This bit is super important; do not apologise just for show.)


Kathryn Hempstalk

Machine Learning for Testers

Kathryn, a data scientist, spoke about the need for testing to extend into her domain: adding quality to the data we analyse is critically important to making fully informed business decisions. When building a machine learning algorithm, data scientists use small data sets to build the rules. The integrity of the data behind those rules is vitally important, as it will shape the rules that are applied to the rest of the data.
Another fascinating example of data science and machine learning in use was login services: analysing a user's typing style to help determine whether their credentials have been compromised. They do this without sample data from other people. As a tester I can see so many potential issues with this solution: what if the user has an injury and can't type normally? How can you test other typing styles to tell whether the attack recognition works?
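The typing-style idea can be sketched as a toy anomaly detector: build a per-user profile from inter-key timing samples, then flag any login whose rhythm deviates too far from it. Everything below - the feature, the threshold, the numbers - is invented for illustration; real keystroke-dynamics systems use far richer models:

```python
# Toy keystroke-dynamics check: profile a user's typing rhythm and flag
# attempts whose average inter-key interval is an outlier.
from statistics import mean, stdev


def build_profile(samples):
    """samples: list of lists of inter-key intervals (seconds) from enrolment."""
    sample_means = [mean(s) for s in samples]
    return {"mean": mean(sample_means), "stdev": stdev(sample_means)}


def looks_like_user(profile, attempt, z_cutoff=3.0):
    """Accept the attempt if its average rhythm is within z_cutoff sigmas."""
    z = abs(mean(attempt) - profile["mean"]) / profile["stdev"]
    return z <= z_cutoff


# Enrolment: the genuine user types their password a few times.
enrolment = [[0.11, 0.13, 0.12], [0.12, 0.12, 0.14], [0.10, 0.13, 0.12]]
profile = build_profile(enrolment)

print(looks_like_user(profile, [0.12, 0.11, 0.13]))  # similar rhythm
print(looks_like_user(profile, [0.40, 0.55, 0.48]))  # very different rhythm
```

Even this toy version surfaces the testing questions above: an injured hand shifts the whole distribution, and you can't validate the "attacker" path without somehow generating other people's typing styles.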
It is always fascinating to hear from other business disciplines at conferences; it broadens our horizons and lets us peek into their world. I wonder if we as testers appear at conferences for other professions in the same way?

Lean Coffee and Games

In between lunch and afternoon tea we had a lean coffee session. For those who don't know the format, check this out here
I was sitting at a leadership table. We covered a number of topics, but something that really stuck with me was that everyone wanted to empower and enable their teams to learn and develop. For myself, I feel privileged and fortunate to work in a job I enjoy. However, I was left frustrated by how negative and archaic some attitudes around the table seemed - in particular the view that learning should be done outside of work hours, and that office time for it cannot come out of project work. I feel quite strongly against this view, and I think it's worth raising in a podcast, so look out for that!

Yvonne Tse @1yvonnetse & Rupert Burton @rupert_burton

Testing is a mindset, not a role - from testing to user research

Another great non-testing talk: this time we gained valuable insight into the user research role, courtesy of Rupert and Yvonne. We're lucky here to work closely with our UX team - that certainly hasn't been the case in previous roles - and even so there were still a lot of new insights. The talk really highlighted the distinct skill sets of testing and user research, and I was surprised by how many we share. Another fascinating stat was around usability studies: at ~5 responses you tend to cover 85% of the issues that user research can find. I was really surprised by how small this number is - though not surprised that many companies refuse to trust this observation and keep spending money on more testing.
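That ~85%-from-5-users figure matches the often-cited Nielsen/Landauer model, which estimates the share of findable usability problems as 1 - (1 - p)^n, where p is the chance a single participant hits a given issue (about 0.31 in their published data) and n is the number of participants. A quick check of the arithmetic, assuming that p:

```python
# Nielsen/Landauer estimate of usability-problem coverage by study size.
def coverage(n, p=0.31):
    """Expected fraction of findable problems surfaced by n participants."""
    return 1 - (1 - p) ** n


for n in (1, 3, 5, 10):
    print(f"{n} participants -> {coverage(n):.0%} of findable issues")
```

Five participants land at roughly 84%, which is why the usual advice is several small studies with fixes in between, rather than one big expensive one.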
It was also great to see how they use AR/VR and other new technologies to power up the UX process.

Samantha Connelly @sammy_lee12

Using robots to test mobile apps

Sammy gave us a talk in two parts. First up we got to see Tappy McTapface in action, a 3D-printed bot driven by an Arduino and the Johnny-Five JavaScript API. It was awesome to see Tappy strut its stuff and play us some music, and I can definitely see some useful applications for it in device testing, although I think it's a couple of years away from the mainstream yet. I'd love to see a company really throw some resource into this tech and see the benefits it brings.
In the second half of her talk, Sammy showed how she organises the outstanding work, or bugs, on a project. It's a concept I've used myself before, but with an extra twist I really liked. The concept relies on plotting the impact vs likelihood of a particular bug affecting end users.
Where the tasks fall determines the priority: fix the ones with the highest likelihood and impact first. At the bottom of the ladder, you might even end up removing functionality if it is so unlikely to be used.
The extra twist I really liked was the addition of coloured spots to denote additional context, such as security or compliance issues, which might bump up the priority of a task.
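The triage above can be sketched as a simple scoring function - the scales and bump values here are my own invention for illustration, not Sammy's exact scheme:

```python
# Impact-vs-likelihood bug triage, with context flags bumping priority.
def priority(bug):
    """Score a bug: impact * likelihood (each 1-5), plus a bump for flags."""
    score = bug["impact"] * bug["likelihood"]
    if bug.get("flags"):  # e.g. {"security", "compliance"}
        score += 5  # context flags push the bug up the queue
    return score


bugs = [
    {"name": "typo in footer", "impact": 1, "likelihood": 5},
    {"name": "payment rounding error", "impact": 5, "likelihood": 2},
    {"name": "token leak in logs", "impact": 3, "likelihood": 1,
     "flags": {"security"}},
]

# Work the list from the highest score down.
for bug in sorted(bugs, key=priority, reverse=True):
    print(bug["name"], priority(bug))
```

Note how the security flag lifts the rarely-hit token leak above the frequently-seen typo, which is exactly the effect the coloured spots are there to capture.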


Lock Note - Angie Jones @techgirl1908

Owning our Narrative

We ended the day with a fantastic, energetic talk from Angie Jones. She recounted the history of recorded music: how the invention of the gramophone and subsequent technologies affected how musicians entertained - shortening tracks, changing instruments and adapting to uncertain times. The role of the musician never changed, but the way they performed it did. As testers, our methodologies, our tools and the subject of our testing all change, so we must adapt and embrace change to succeed - and so must our tools and processes!
Our community, particularly in Auckland, has a reputation for bemoaning and fighting change; this talk was a fantastic eye-opener encouraging us to welcome change and grow with it.
Key takeaways
* Don’t fight the change, you’ll just be playing catchup
* Adapt your tools and processes to change with you
* Embrace change

The conference was a fantastic day all round. I would have loved to speak to more people outside of those I already knew and swap more ideas - please comment below with your thoughts and experiences of the conference!