• Quality at the intersection of human and artificial intelligence

    Time for another yvrTesting! Last month's session was a good one; we're getting the hang of the virtual presentation, with some nice dynamic interaction. I don't think we'll ever be online all the time, since networking in person is just more personal, but during these times you take what you can get. Learning and improving is every bit as important today as it always has been. That said, our next topic is super interesting. Dr. Christin Wiedemann returns to present with her colleague Omar Galeano about AI and Machine Learning in Quality.

    URL: https://pqa-ca.highfive.com/yvrtesting

    Quality at the intersection of human and artificial intelligence

    Various implementations of artificial intelligence (AI), and in particular machine learning (ML), are becoming more and more ubiquitous, and as a software tester you are increasingly likely to come across ML, whether directly or indirectly. Hence, testers need to understand what AI and ML really are, and how to adapt testing practices and approaches. In this session, Omar and Christin will provide an easy-to-understand definition of what ML is and, more importantly, what it is not. They will also talk about what we need to think about when testing ML implementations as compared to traditional software, and give examples of how we can use ML in testing. Omar will also present a case study of an AI/ML project and the test strategies he developed to adapt to this novel context.

    Bios:

    Omar Galeano
    Omar is a Quality Engineering Principal at Slalom Build, with over 15 years' experience working in different quality roles. His background in electrical engineering, physics, and computer science provides him with the skills required to push the boundaries of how we test ML, and how we use ML to test. As a leader, Omar helps his teams grow, as a group and individually, providing support and promoting continuous learning.

    Dr. Christin Wiedemann
    A Ph.D. particle physicist by training, Christin uses her scientific background and analytical skills to dissect complex software problems. Her career in the technology sector has primarily been spent in the professional services industry, starting as a quality assurance consultant and progressing through different manager and director roles. As Practice Area Lead of Quality Engineering at Slalom Build, she is part of an incredible team of quality advocates, working collaboratively with clients to create ground-breaking products while simultaneously advancing quality engineering practices.

  • Testing With Your Eyes Open

    Online event

    Welcome back to yvrTesting! Well, that sure was some sort of Covid hiatus, wasn't it? We're back, and I have a great topic that I think is really interesting to kick things off, presented by myself. I have another on deck for December that I'm really excited about. We'll do this one online, and likely the next few, but someday we will get back together in person.

    URL: https://pqa-ca.highfive.com/yvrtesting

    Testing with your Eyes Open
    Some people (not testers) believe that testing is about having a defined script: steps that you follow one by one. The truth about good testing, though, is being aware: understanding how things work, how they fit together, and keeping your eyes open wide enough to notice what matters.

    It's easy though to fall prey to test case fatigue, to end up with tunnel vision that makes it hard to notice the things in your testing that aren't in your test case path. The more times you run those tests the harder it is to keep your eyes open. I want to dig into that and help build some understanding around the problem and give you some tools to help increase your ability to find more and better issues.

    Mike Hrycyk
    Bio:

    Mike has been trapped in the world of quality since he first did user acceptance testing 23 years ago. He has survived all of the different levels and a wide spectrum of technologies and environments to become the quality dynamo that he is today. Mike believes in creating a culture of quality throughout software production and tries hard to create teams that hold this ideal and advocate it to the rest of their workmates. Mike is currently the VP Service Delivery, West for PQA Testing, but has previously worked in social media management, parking, manufacturing, web photo retail, music delivery kiosks and at a railroad. Intermittently, he blogs about quality at www.qaisdoes.com and tweets at @qaisdoes.

  • Trends in Test Automation

    Online event

    Welcome back to yvrTesting! In our current world of COVID-19 social distancing we're jumping on the online meetup bandwagon. Join us on Wednesday to talk about automation with Dalibor Maric. We heard Dalibor speak at a conference in Vancouver in February and enjoyed his talk, so we've asked him to come and give you a modified version.

    Trends in Test Automation
    I often get asked what the trends in test automation are: are companies doing more functional and load testing today, and is automation usage in those areas on the incline or decline? The reality is that it is on the incline; however, there is so much more to cover in testing today than just functional flows and performance. The overall architecture of today's systems is much more complex than before. There are many more pieces and different technologies involved in a single solution, which demands a much more complex test architecture, and tooling to match. There are many aspects of today's SaaS systems running in the cloud that cannot be tested manually; the only way to test them is through automation. What's more, many systems are powered by real-time dynamic data that influences functional flows on the fly, which makes data not just raw input in the traditional sense but part of the feature itself. That means we cannot exercise functional flows without really understanding real-life data and bringing it back into our testing.

    All of that requires a new generation of tools that enable test engineers to do the full spectrum of automated testing and, moreover, to measure the coverage and quality of the systems under test. In many cases, off-the-shelf automation solutions are no longer sufficient for proper functional, scalability, or reliability testing; that era is in the past. Today the demand is to build flexible frameworks that test engineers can customize and tailor to their specific needs, because there is data-as-a-feature that needs to be analyzed and brought into testing, and there are so many different ways to configure today's systems for customer-specific needs. The more abstract, decoupled, and easy to customize a test framework is, with as many common reusable building blocks as possible, the better it will meet test automation needs.
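
    As a very rough illustration of the "reusable building blocks" and "data as a feature" ideas above, here is a minimal, hypothetical Python sketch (not Dalibor's framework; the data feed and checks are made up): small, decoupled steps composed into a flow, with the data injected at run time rather than hard-coded into each test.

    ```python
    # Hypothetical sketch: composable test "building blocks" with run-time data injection.
    from dataclasses import dataclass
    from typing import Any, Callable


    @dataclass
    class Step:
        """A reusable building block: a named action that works on a shared context."""
        name: str
        run: Callable[[dict], Any]


    def run_flow(steps: list[Step], context: dict) -> None:
        """Execute steps in order, reporting which building block failed."""
        for step in steps:
            try:
                step.run(context)
            except AssertionError as exc:
                raise AssertionError(f"step '{step.name}' failed: {exc}") from exc


    def load_live_data(context: dict) -> None:
        # Stand-in for "data as a feature": a real suite would pull live data here.
        context["cart_items"] = [{"sku": "A-1", "price": 19.99}, {"sku": "B-2", "price": 5.00}]


    def check_cart_total(context: dict) -> None:
        total = sum(item["price"] for item in context["cart_items"])
        assert total > 0, "cart total should be positive"


    if __name__ == "__main__":
        run_flow(
            [Step("load live data", load_live_data), Step("check cart total", check_cart_total)],
            context={},
        )
        print("flow passed")
    ```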

    Dalibor Maric
    Bio:

    At the start of my career in 2004 I was introduced to test engineering, and it has been my focus and passion ever since. I started as an individual contributor performing all kinds of testing, including functional, scalability, and localization, with a focus on automation. At one point I moved into leadership roles with responsibility for building QA and test engineering organizations, defining test strategies and best practices, and architecting and building automation frameworks. In my testing career I have worked on Oracle's payroll system and Microsoft's Dynamics suite, and for the last seven years I have been part of Electronic Arts, focusing on test engineering in gaming. I am a proud patent holder for an innovative test framework that my tools team and I have architected and built at EA over the last six years.

    My wife and I spend most of our free time raising our two little girls, and this is what I am most proud of in my life. When I do find some free time just for me, I like to read and play soccer.

  • Automation War Stories Panel

    777 Dunsmuir St

    We all agree that some form of automation is a necessity, but also that it is a complex technological ecosphere that has to fit itself into a highly variable set of situations and environments. All the theory in the world can only begin to help you figure out the solution to your own problem.

    Using the notion that experiential learning is king, and that knowing what not to do and what can go wrong means you are forewarned and therefore forearmed, we've assembled an experienced automation panel. The panel will bring you some stories of Automation Gone Wrong, as well as some stories about how they were able to convince the powers that be that automation is the way to go.

    In the end we expect you'll gain some knowledge that will help you with your own automation journey and without doubt you'll smile a bit along the way.

    Bios

    Jim Peers

    Jim is a Senior QA Engineer at OXD, with 18 years of experience in software development and testing in multiple industry verticals. After working in the scientific research realm for a number of years, Jim moved into software development, holding roles as tester, test architect, developer, team lead, project manager, and product manager. Jim is now back in the trenches, writing automation code, and is very interested in how the technology has progressed.

    Arash Taheri

    Arash Taheri manages a B2B development team at Samsung R&D Canada. With over 13 years of industry experience in software QA, automation testing, team leadership, and project management, Arash is taking on a new challenge by leading the development team at Samsung through adopting a DevOps culture. Due to his extreme passion for quality and QA processes, Arash continues to expand his knowledge in various fields that directly impact QA practices. He has obtained certifications in Agile (CSM, CSPO) and Project Management (PMP) to broaden his vision of QA from various angles.

    Sergii Tolbatov

    Sergii Tolbatov is an accomplished IT professional with 10+ years of software development experience. He has helped build and guide Quality Assurance teams through all phases of software development, from inception through post-production support. His experience in software Quality Assurance has led to the successful delivery of several multi-million-dollar business transformation programs.

    Theresa Deering

    A regular attendee at yvrTesting, Theresa started her testing career 3 years ago at Vision Critical. Before that she worked as a full-stack web developer at Morgan Stanley (a big American bank), iLanguage Lab (a 3-person startup), Visit Scotland (the tourism board of Scotland), and Aquafadas (an e-publishing company in the south of France). Now she is a Staff QA Engineer at Uplight, where her work contributes to reducing carbon emissions by convincing people to use less energy in their homes.

  • Applying KonMari to your Automation to Achieve Tests that Spark Joy

    From an untidy legacy suite to test codebases that spark joy, inspired by applying
    KonMari: a minimalism-inspired approach to tidying up your stuff category by category.

    Testing Scenario:

    You’re tasked to work on an existing test automation suite, and you notice the usual problems:

    - Slows down the development process
    - Test addition is an afterthought in the dev cycle
    - Doesn’t point to the problem area in code on failure
    - Doesn’t actually give you regression confidence
    - Relies on just one person to fix or run it (you!)
    - Tests get or “are tagged to be” ignored

    You need help. And the help you need is not creating a new framework, but rather learning to tidy it up so that instead of problems, your tests spark joy.

    In this talk, we learn to apply rules and techniques to achieve test atomicity, efficient DOM naming for tests, tagging of XHR calls, a folder structure for better readability, and deciding “what tests to keep” rather than “what tests to discard”, all to keep your suite clean, lean, and mean. You’ll learn that your tests, like mine, can spark joy.
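
    As a flavour of two of those ideas (test atomicity and stable DOM naming), here is a minimal, hypothetical pytest/Selenium sketch; the application URL and data-test selectors are made up, and this is not Divya's actual suite, which is built on her team's own Cypress/Selenium setup.

    ```python
    # Hypothetical sketch: atomic UI tests that locate elements via data-test attributes.
    import pytest
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    BASE_URL = "https://app.example.com"  # made-up application under test


    @pytest.fixture
    def browser():
        # Every test gets its own fresh session, keeping tests atomic and order-independent.
        driver = webdriver.Chrome()
        yield driver
        driver.quit()


    def test_invoice_list_shows_rows(browser):
        # Select by a dedicated data-test attribute instead of brittle CSS classes or XPath.
        browser.get(f"{BASE_URL}/invoices")
        rows = browser.find_elements(By.CSS_SELECTOR, "[data-test='invoice-row']")
        assert rows, "expected at least one invoice row"


    def test_saving_an_invoice_shows_confirmation(browser):
        # This test sets up its own state rather than depending on the previous test.
        browser.get(f"{BASE_URL}/invoices/new")
        browser.find_element(By.CSS_SELECTOR, "[data-test='save-invoice']").click()
        banner = browser.find_element(By.CSS_SELECTOR, "[data-test='confirmation-banner']")
        assert "saved" in banner.text.lower()
    ```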

    Key Learning Points:

    1. Existing heavy automation frameworks can be shaped up by efficient
    DEV+QA pairing on an existing UI codebase.
    2. How to make good automation practices a habit.
    3. How to select/keep + create limited automation tests that provide
    optimum coverage and a decent run-time.

    Bio:
    Divya Rakhiani

    She has more than 8 years of experience in Quality Assurance, working at a startup (Beanworks), services companies (Infosys and Thoughtworks), and product companies (Dell EMC). She currently heads the Quality Engineering department at Beanworks, a startup that specializes in automating the tedious and paper-heavy accounting process; it’s rated the 13th best startup in the Vancouver tech market. Divya sees herself today as a new immigrant to Canada and is building her career helping QA teams use technologies like UI automation with Cypress/Selenium, test stabilization pipelines, and CI/CD for each code deployment.

    She is passionate about preaching quality as a mindset in the software development cycle, not a stage. She was an active member of the Vodqa meetup group at Thoughtworks, India, and has spoken at various meetups in Delhi, India about testing in microservices, contract testing, the test pyramid, service virtualization, and automation best practices. When not working, Divya likes to lift weights at the gym! #girlswhocodeandlift. Most of her weekends are spent running in beautiful Stanley Park, walking the dog, or meal prepping and daydreaming about all the shoes she can buy ;)

  • Lessons Learned Running Test Automation Projects

    777 Dunsmuir St

    With a last-minute cancellation, Micheal Schoonbaert has jumped in to give us a presentation that he was planning for a future conference this season. Micheal brings us a great depth of experience from organisations where delivering safe software is the most important thing, and this definitely influences the way he approaches his projects. It will be interesting to learn from that experience.

    In Micheal's own words...

    "Over my career in IT and testing, I have been responsible for a number of test automation projects. For some, I did the actual work. For others, I supervised the people building the tools, while providing my input on how the goals should be accomplished.

    I will talk about a lot of business practices that I have learned are helpful in driving the success of the automation project as well as your business in general.

    Prior to starting the planning process for any automation project, there are a number of things that need to be done, people who need to be involved and data that needs to be gathered. I will also discuss Change Management techniques that will help you move your project from inception to completion with higher chances of success.

    Once the pre-planning phase is complete, you should be ready to jump into the planning process. I will present a number of best practices and things to consider when moving forward with your project including Test Frameworks, Tool selection and the Proof(s) of Concept.

    Lastly, when you are ready to put your team to work and give them marching orders, I will discuss some general Best Practices for Test Automation projects to keep in mind when you move into the Development phase of the project."

    Bio:
    Micheal Schoonbaert

    Software Test Consultant with extensive experience in Information Services roles including Software Testing, Process Improvement and Test Team Leadership.

    Areas of specialty include test team management, recruiting, training, process and procedure creation and maintenance, disaster recovery, process improvement and change management using Six Sigma methodology. Six Sigma Green Belt achieved in 2013.

    Worked in a diverse set of projects related to Air Traffic Control, Medical Clinical Software, Mining, Medical Imaging, Utilities (Gas/Electric), Health and Safety Insurance and Finance Management.

    Currently working at UBC as a Test Analyst on the Integrated Renewal Project, where he looks after the FIN testing as the Finance, HR, and Student systems are replaced by a single SaaS system.

  • Automating your Data Validation and Testing

    777 Dunsmuir St

    In business today, winners and losers in the marketplace are separated by fractions of a percentage point on key KPIs. Today’s most successful, innovative, and disruptive companies rely on effectively using analytics and reporting to identify opportunities, visualize trends, and get a leg up on the competition.

    As reliance on data lakes, big data, and business intelligence systems grows, so do the risks of failing to find defects in the ingestion, ETL, or reporting systems that support those decisions.

    This session from Tricentis will discuss the pitfalls of testing BI and DW environments and offer tips to help you develop a strategy for it as part of a holistic end-to-end test automation strategy.

    Note: This won't really be a product-focused presentation. As with all yvrTesting sessions, it's about understanding the problems and working towards possible solutions.
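
    For a concrete feel of the kind of check involved, here is a minimal, hypothetical sketch of one common DW/ETL validation: confirming a loaded target table still matches its source on row count and a summed figure. This is not Tricentis tooling or Rob's solution; the table and column names are made up, and an in-memory SQLite database is used so the example runs standalone.

    ```python
    # Hypothetical sketch: reconcile a target table against its source after an ETL load.
    import sqlite3


    def reconcile(conn: sqlite3.Connection, source: str, target: str, amount_col: str) -> None:
        """Fail if row count or summed amount drifts between source and target tables."""
        src_count, src_sum = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {source}"
        ).fetchone()
        tgt_count, tgt_sum = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {target}"
        ).fetchone()
        assert src_count == tgt_count, f"row count mismatch: {src_count} vs {tgt_count}"
        assert abs(src_sum - tgt_sum) < 0.01, f"amount mismatch: {src_sum} vs {tgt_sum}"


    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.executescript(
            """
            CREATE TABLE sales_source (id INTEGER, amount REAL);
            CREATE TABLE sales_target (id INTEGER, amount REAL);
            INSERT INTO sales_source VALUES (1, 10.0), (2, 25.5);
            INSERT INTO sales_target VALUES (1, 10.0), (2, 25.5);
            """
        )
        reconcile(conn, "sales_source", "sales_target", "amount")
        print("source and target reconcile")
    ```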

    Bio:
    Robert Gutierrez
    Team Lead, Solution Architecture, AMS West

    Based in Vancouver, Rob is a software solution architect with experience helping organizations across all verticals and time zones improve user engagement and application experience by delivering high-quality user experiences. Today, he works for Tricentis, a global leader in end-to-end automation and continuous testing, where he is responsible for delivering BI / data warehouse test automation solutions.

  • API Testing - Learning It from Someone Untrainable

    777 Dunsmuir St

    For over 20 years Kevin Morgan has been a testing professional focusing on manual testing, process improvements, and the like. Although he has never shied away from the more technical aspects of his job (he can confirm results through queries like a machine), there has always been a convenient excuse getting in the way of learning how to do that newest thing: automation. It’s not that he ever denied it was useful and should be a valuable cornerstone of any test strategy, but rather that there was always something more important for Kevin to be doing or learning at the time. Recently Kevin became convinced that this automation fad was probably here to stay and wondered about the best way to start his learning path. Kevin knew that the tired adage of ‘can’t teach an old dog new tricks’ didn’t apply to him, because he’s far too stubborn to give up easily.

    What he did find was that there are thousands of ‘best paths’ to learning this stuff out there, and hundreds of courses that claim to know what you need to know and how to get there. On the advice of some very wise people, Kevin decided that learning API testing would be a great entry point into the automation world: it could give him some foundational skills and the context to really understand some of the more advanced concepts being taught. Yet so much of the training relied on saying things that were completely obvious to the instructor but took him two hours of tertiary investigation on his own just to understand.
    Kevin has succeeded in learning API testing, in spite of the seemingly best efforts of experts to confuse him, and in this presentation he’s going to help you get on the road to becoming an API expert, with context, translations, explanations, and, in the end, an enlightening demo. There might be a story or two of paths best not taken, if you’re lucky.
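
    To give a taste of the kind of first step the talk demystifies, here is a minimal, hypothetical sketch of an API test (not Kevin's demo): call an endpoint, check the status code, then check the shape of the JSON body. It uses Python's requests library against the public httpbin.org echo service, so no local setup is assumed.

    ```python
    # Hypothetical sketch: a first API test with the requests library.
    import requests


    def test_get_echoes_query_params():
        # httpbin.org/get echoes the query parameters back in the JSON body under "args".
        response = requests.get(
            "https://httpbin.org/get", params={"meetup": "yvrTesting"}, timeout=10
        )

        # Transport-level check first...
        assert response.status_code == 200

        # ...then payload-level checks on the response body.
        body = response.json()
        assert body["args"]["meetup"] == "yvrTesting"


    if __name__ == "__main__":
        test_get_echoes_query_params()
        print("API test passed")
    ```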

    Bio:

    Kevin has been an IT software professional for over 20 years in a variety of industries including financial services, insurance, telecommunications, occupational health, logistics and healthcare.

    His QA career began in the early 1990s, when he was tasked with testing a new banking system that was being implemented at a financial institution where he worked. A short time later he was hired by a company to test the same banking system again as it was implemented for 157 credit unions throughout BC, Alberta, and Saskatchewan.

    After 6 years he began his career as an independent QA professional, working for the company that created the banking system.
    Since then, his work has included engagements with companies such as IBM, ICBC, Telus, Finning, Vancity, and WorkSafeBC, as both a Test Lead and a QA Manager on highly complex implementations and enterprise business transformations.

    Bio:

    Reuben is a self-motivated software tester with a demonstrated history of working in the hospitality industry. He enjoys working with a team or individually to complete objectives and is able to adapt in fast-paced work environments. Organization, clear communication, time management, a positive attitude, and attention to detail are some of his defining attributes.

    He has been with PLATO since February 14, 2019. He has worked with TransLink, Fortis, BC Hydro, RainCity Housing, and Blue Spurs, and he enjoys the amazing support from the other teams he has worked with.

  • Not Your Enemy - Speaking PM - What Testers Should Know

    777 Dunsmuir St

    Project Managers are often the vilified middle-person between the sponsor and the technical doers on a project. Did you ever consider, however, that PMs actually hold the accountability for a project’s success, including successful testing of the app/product/service/etc.? Ever wonder what the meaning is behind some of the questions your Project Manager asks you? Do the inquiries ever feel repetitive, annoying, without value or purpose?
    It may surprise you to be told that project managers have the same goals that you do and aren’t actually plotting secret voodoo with the questions they ask. They want the project to succeed just as much as you do. Jessica Evans, self-described PM Geek, will break down where your PM may be coming from and lead us through the language and approach to use, so that testers and PMs can reach some level of shared understanding and you can get back to focusing on testing!

    Bio:
    Jessica Evans is a Certified Project Management Professional (PMP) and the founder of Jocosity Management Solutions, Inc. A self-described “PM Geek”, she is genuinely enthusiastic about sharing project management methodologies and how she has leveraged project management in her personal and professional life. Jessica has worked with Fortune 500 companies like Coca-Cola UK and Pokémon, along with government agencies and many others.

    The focus for her PM consultancy these days is to support businesses with portfolio management, design robust processes, and implement these along with customized team training.
