Update! Speakers & Talks for GTAC 2010

We are thrilled to announce the speakers and talks for the 5th Google Test Automation Conference (GTAC). This year’s event will have a total of 11 talks: the three keynotes that we announced earlier and eight other talks. These talks span the three sub-categories of Testing, Testability, and Test Automation, which are an integral part of this year’s theme, “Test to Testability”.

As we shared earlier, for this year’s GTAC we used a new process of letting the selected attendees vote on the talks they wanted to be a part of GTAC. The committee tallied the votes and ensured a healthy balance of topics, participants from across the globe, and relevance to our theme. We received over 80 submissions, for an acceptance rate of about 10%. Our thanks to everyone who submitted a proposal and to all the attendees who voted to make this a successful process.

Here is the list of talks. More details can be found on the selected talks page on the GTAC site.

Category: Testing
  • Early Test Feedback by Test Prioritisation (Shin Yoo, University College London & Robert Nilsson, Google Inc.)
  • Crowd-source testing, Mozilla community style (Matt Evans, Mozilla)
  • Measuring and Monitoring Experience in Interactive Streaming Multimedia Web Applications (Shreeshankar Chatterjee, Adobe Systems India)
Category: Testability
  • Flexible Design? Testable Design? You Don’t Have To Choose! (Russ Rufer and Tracy Bialik, Google Inc.)
  • Git Bisect and Testing (Christian Couder)
  • Lessons Learned from Testability Failures (Esteban Manchado Velazquez, Opera Software ASA)
Category: Test Automation
  • The Future of Front-End Testing (Greg Dennis and Simon Stewart, Google Inc.)
  • Twist, a next generation functional testing tool for building and evolving test suites (Vivek Prahlad, ThoughtWorks)
For further information on the conference, please visit its webpage at http://www.gtac.biz.

Sujay Sahni for the GTAC 2010 Committee

By James Whittaker

First and foremost, apologies to all of those who tried to get to our NY event and weren't able to do so. It was an absolutely packed house; frankly, its popularity overwhelmed us! Clearly the mixture of a Google tour, Google goodies, food, drink, and testing is an intoxicating cocktail.

The event was not taped, but GTAC will be, and I likely won't have been part of a two-hour party before that talk! Some things, I think, are better off unrecorded and off the record...

We will be having more of these events in the future. We'll learn from this and make sure you have plenty of warning.

Thanks for understanding and if any rumors emerge from this event about things I may have said on stage...you can't prove anything!

By James Whittaker

Ever look at a testing problem and wonder how to solve it? If so, you know what it feels like to lack domain expertise. Sometimes this is user-oriented knowledge. Testing a flight simulator requires knowledge of how to fly a plane. Testing tax preparation software requires knowledge of accounting. Other times the knowledge is more problem-oriented. Testing a mobile operating system means understanding how Wi-Fi and device drivers work. Whenever the bill of materials contains a testing problem that the risk analysis identifies as important, the expertise needed to test it needs to be on the testing team. Hire it, contract it, outsource it. Whatever it takes to ensure that people who know what they are doing, and have experience doing it, are on staff for the duration of the project. There is no technological substitute for expertise.

It doesn't matter how good you think you are at exploratory testing: if you don't understand how something works, find someone who does.

Google is holding a testing event in our NY office Wednesday, September 15 at 5:30pm. This includes a tour of our local offices and a live talk on how Google does testing by our own James Whittaker. Rumor has it he's using an early version of his GTAC talk. Lots of food, drink and Google giveaways.

By James Whittaker

Possessing a bill of materials means that we understand the overall size of the testing problem. Unfortunately, the size of most testing problems far outstrips any reasonable level of effort to solve them. And not all of the testing surface is equally important. There are certain features that simply require more testing than others. Some prioritization must take place. What components must get tested? What features simply cannot fail? What features make up the user scenarios that simply must work?

In our experience it is the unfortunate case that no one really agrees on the answers to these questions. Talk to product planners and you may get a different assessment than if you talk to developers, salespeople, or executive visionaries. Even users may differ among themselves. It falls to testers to act as user advocates and to weigh all of these concerns when prioritizing how testing resources will be distributed across the entire testing surface.

The term commonly used for this practice is risk analysis, and at Google we take information from all of the project's stakeholders to come up with overall numerical risk scores for each feature. How do we get all the stakeholders involved? That's actually the easy part. All you need to do is assign numbers and then step back and have everyone tell you how wrong you are. We've found that being visibly wrong is the best way to get people involved, in the hopes that they can influence getting the numbers right! Right now we are collecting this information in spreadsheets. By the time GTAC rolls around, the tool we are using for this should be in a demonstrable form.
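To make the scoring idea concrete, here is a minimal sketch in Python of the kind of calculation such a spreadsheet might do. The feature names, the 1-10 scales, and the likelihood-times-impact formula are illustrative assumptions for this sketch, not a description of the actual tool:

    # A minimal illustration of stakeholder-driven risk scoring. The scales,
    # feature names, and likelihood-times-impact formula are assumptions made
    # for this sketch; they do not describe the actual spreadsheet or tool.
    from statistics import mean

    # Each stakeholder rates every feature on two 1-10 scales: how likely
    # the feature is to fail, and how costly a failure would be.
    ratings = {
        "checkout flow": [
            {"who": "developer", "likelihood": 3, "impact": 9},
            {"who": "planner", "likelihood": 5, "impact": 10},
            {"who": "sales", "likelihood": 2, "impact": 8},
        ],
        "settings page": [
            {"who": "developer", "likelihood": 6, "impact": 4},
            {"who": "planner", "likelihood": 4, "impact": 3},
        ],
    }

    def risk_score(feature_ratings):
        # Average each dimension across stakeholders so no single voice
        # dominates, then multiply: the riskiest features are those that
        # are both fragile and important.
        likelihood = mean(r["likelihood"] for r in feature_ratings)
        impact = mean(r["impact"] for r in feature_ratings)
        return likelihood * impact

    # Rank features so the riskiest get testing attention first. Publishing
    # this ranking is what draws the "you got it wrong" feedback that pulls
    # stakeholders into the process.
    for feature, score in sorted(((f, risk_score(rs)) for f, rs in ratings.items()),
                                 key=lambda pair: pair[1], reverse=True):
        print(f"{feature}: risk {score:.1f}")

Averaging and multiplying is only one way to combine the numbers; the point is that any concrete, visible score gives stakeholders something specific to argue with.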