Test Focus Group: Testing in Production and Web Testing

Details
Note: There is a $3 charge to cover meetup expenses (membership fee, food, door prizes, etc.). Since payment can't be set up here, you can pay at https://www.meetup.com/seattle-software-test-talk/events/77904932/ or at the door.
===== Agenda =====
6:00pm - 6:20pm, Pizza & social
6:20pm - 6:30pm, Introduction
6:30pm - 7:30pm, A to Z Testing in Production: Industry Leading Techniques to Leverage Big Data for Quality, by Seth Eliot from Microsoft.
Testing in production (TiP) is a set of software methodologies that derive quality assessments not from test results produced in a lab but from where your services actually run: in production. The big data pipe from real users and production environments can be used in a way that leverages the diversity of production while mitigating risks to end users. By leveraging this diversity, we can exercise code paths and use cases that we could not reach in the test lab or did not anticipate in test planning.
This session introduces test managers and architects to TiP and gives these decision makers the tools to develop a TiP strategy for their service. Methodologies such as Controlled Test Flights, Synthetic Test in Production, Load/Capacity Test in Production, Data Mining, Destructive Testing, and more are illustrated with examples from Microsoft, Netflix, Amazon, and Google. Participants will see how these strategies boost ROI by making live site operations their signal for quality. (A minimal sketch of a synthetic production probe appears after the agenda.)
7:30pm - 8:30pm, Feedback-Driven Automated Testing for Web Applications, by Scott McMaster from Google.
Creating automated tests that exercise a web application through a browser is a challenging and time-consuming process. Real-world industrial web applications pose unique challenges to test automation tools: the applications are often very large, with hundreds or thousands of HTML elements, and the state space of possible interactions between those elements is huge, so the efficiency and runtime performance of the automated testing framework are critical. Because industrial web applications also undergo frequent user interface changes, small changes can break large numbers of tests, so it is important that tests remain relatively stable and maintainable over a long period of time, or at least can be easily regenerated by the automation. This session presents a new automated testing tool, WebTestingExplorer, which is available as open source. We will discuss the challenges and opportunities in automated web application testing and see how WebTestingExplorer enables model-driven and feedback-driven automated testing, touching on most aspects of the testing process, including oracle generation, defect detection, test case generation, and replay. (A rough sketch of feedback-driven exploration appears after the agenda.)
8:30pm - 9:00pm, Lucky draw & Mingle
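For attendees new to these ideas, here is a minimal sketch of one technique mentioned above, a synthetic test in production: a scripted probe issues a synthetic transaction against a live endpoint, checks a simple oracle, and records latency. The endpoint URL, the expected response marker, and the latency budget below are hypothetical placeholders, not details from the talk.

```python
# Minimal sketch of a synthetic test-in-production probe.
# PROD_URL, the "healthy" marker, and the latency budget are hypothetical.
import time
import urllib.error
import urllib.request

PROD_URL = "https://example.com/api/health"  # hypothetical production endpoint
LATENCY_BUDGET_S = 1.0                       # hypothetical latency budget

def run_synthetic_probe(url: str) -> dict:
    """Issue one synthetic transaction against production and score it."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            status = resp.status
            body = resp.read()
    except urllib.error.URLError as exc:
        return {"ok": False, "error": str(exc), "latency_s": time.monotonic() - start}
    latency = time.monotonic() - start
    return {
        "ok": status == 200 and b"healthy" in body,  # oracle: expected marker in response
        "status": status,
        "latency_s": latency,
        "within_budget": latency <= LATENCY_BUDGET_S,
    }

if __name__ == "__main__":
    print(run_synthetic_probe(PROD_URL))
```

In practice such probes run continuously from multiple locations, and their results feed alerting and quality dashboards rather than a one-off print statement.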
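And here is a rough sketch of the feedback-driven exploration idea, written against the Selenium WebDriver API rather than WebTestingExplorer itself: the script enumerates the actions available in the current page state, exercises one, and lets the resulting state drive the next step. The starting URL and exploration budget are placeholders; WebTestingExplorer's own API is not shown here.

```python
# Illustrative feedback-driven exploration with Selenium WebDriver
# (not the WebTestingExplorer API). START_URL and MAX_ACTIONS are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

START_URL = "https://example.com"  # hypothetical application under test
MAX_ACTIONS = 10                   # keep the state-space walk small

def explore(start_url: str, max_actions: int) -> list:
    driver = webdriver.Chrome()    # assumes chromedriver is available on PATH
    visited_urls = {start_url}
    log = []
    try:
        driver.get(start_url)
        for _ in range(max_actions):
            # Feedback step: enumerate actionable elements in the current state.
            links = driver.find_elements(By.CSS_SELECTOR, "a[href]")
            candidates = [l for l in links
                          if l.get_attribute("href") not in visited_urls]
            if not candidates:
                break
            target = candidates[0]
            log.append({"page": driver.current_url, "clicked": target.text})
            visited_urls.add(target.get_attribute("href"))
            target.click()         # state transition; the new page drives the next step
    finally:
        driver.quit()
    return log

if __name__ == "__main__":
    for step in explore(START_URL, MAX_ACTIONS):
        print(step)
```

A real exploration tool would also record oracles (for example, DOM snapshots or invariant checks) at each state so that discovered action sequences can be replayed later as regression tests.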
Speaker Bios:
Seth Eliot is a Senior Knowledge Engineer for Microsoft Test Excellence, focusing on driving best practices for services and cloud development and testing across the company. He was previously a Senior Test Manager, most recently for the team solving exabyte storage and data processing challenges for Bing, and before that he enabled developers to innovate by testing new ideas quickly with users “in production” via the Microsoft Experimentation Platform ( http://exp-platform.com ). Testing in Production (TiP), software processes, cloud computing, and other topics are ruminated upon at Seth's blog at http://bit.ly/seth_qa and on Twitter (@setheliot). Prior to Microsoft, Seth delivered high-quality software services at Amazon.com, where he led the Digital QA team in releasing Amazon MP3 download, Amazon Instant Video Streaming, and Kindle Services.
Scott McMaster is a Software Design Engineer at Google in Kirkland, working on Google Code ( http://code.google.com ). He has taught object-oriented programming and software architecture and design as an adjunct professor in the Master of Software Engineering program at Seattle University. Prior to Google, Scott worked as a software engineer, architect, and test engineer at Microsoft, Lockheed Martin, Amazon.com, and a couple of small startups. He has a Ph.D. in Computer Science from the University of Maryland, where his thesis presented a novel approach for test coverage and test suite maintenance. His web site is http://www.scottmcmaster365.com/ .
We are actively seeking new sponsors and speakers for future meetups...
