Agile Testing @DUS #1

This is a past event

13 people attended

ConSol Consulting & Solutions Software GmbH

Kanzlerstraße 8 · Düsseldorf

How to find us

Getting there by public transport: U71/U72, S6. The meetup is on the ground floor; the room is signposted.

[Image of the venue]

Details

The time has come! The first Agile Testing Meetup in Düsseldorf is ready to go.

We are planning an evening on[masked] with pizza, beer, and two exciting talks all about testing!

18:30 - Meet & Greet with catering
19:00 - Welcome
19:10 - Containerized Test Automation (Sven Hettwer)
20:00 - Agile Testing vs. Microservices - What could possibly go wrong? (Oliver Weise)
20:30 - Open end / networking

Containerized Test Automation:
The focus of this talk is to show how to reduce the effort of software testing by using Open Source software efficiently in a state-of-the-art environment. Besides briefly introducing the challenges of efficiently testing microservices in a cluster environment, the talk covers concepts such as OpenShift build pipelines using Jenkinsfiles, as well as insights into how to test various interfaces such as HTTP, Apache Kafka, and JDBC automatically during the deployment pipeline of a sample microservice project. The integration tests use the Open Source integration testing framework Citrus.
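The idea of exercising an HTTP interface automatically during a pipeline run can be sketched without Citrus, for instance in plain Python. The endpoint path, port, and payload below are invented for illustration and are not taken from the talk:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Stand-in for the microservice under test: a tiny HTTP endpoint
# answering a hypothetical /health route with a JSON body.
class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"status": "UP"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

def run_http_check(port: int = 8081) -> dict:
    """Start the stub service and probe it the way a pipeline stage would."""
    server = HTTPServer(("127.0.0.1", port), HealthHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    try:
        with urlopen(f"http://127.0.0.1:{port}/health") as resp:
            assert resp.status == 200, "service must answer 200 OK"
            return json.loads(resp.read())
    finally:
        server.shutdown()
        server.server_close()

result = run_http_check()
assert result == {"status": "UP"}
print("HTTP interface check passed:", result)
```

In the setup the talk describes, a check like this would run as a stage of the build pipeline against the freshly deployed container, with Citrus providing the send/receive and validation steps for HTTP, Kafka, and JDBC alike.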

Sven Hettwer works as a software developer at ConSol Software GmbH in Düsseldorf. Since his studies (B.Sc. in Computer Science) he has worked intensively on topics such as software quality and automation, and since entering industry about three years ago he has gained experience in a wide range of (partly Open Source) projects. Lately, Sven has focused primarily on test automation in combination with container technologies and CI/CD.

Agile Testing vs. Microservices - What could possibly go wrong?
So we are all embracing agile testing by now, right? We ensure testable stories in our "definition of ready", we define acceptance criteria in the sprint planning and we use great testing tools to ensure that they are met in an automated fashion. This by itself allows us to release frequently, independently and with good confidence in the quality of our product.
On the other hand - true story - we might be working in a massive IT ecosystem containing loads of microservice modules that interact with our module and with each other in very complex ways, and which are generally "moving targets", as they keep evolving while we are developing on our side. And not all of them are as reliable regarding availability and interface strictness as would be desirable.
In such an environment, testing your own isolated module against a mocked outside world and defining interface contracts will only take you so far. At some point before going live you will want to test against actual backends, not mocks, and see what your contracts are worth before risking a big meltdown. This talk takes a quick look at our quest to organize these end-to-end tests in the described project, what can go (and has gone) wrong, and which solutions are worth evaluating.
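The "see what your contracts are worth" step can be illustrated with a minimal consumer-side contract check: the consumer states which fields and types it relies on, and the same check runs once against the mock and once against the real backend, so drift shows up immediately. All field names and sample responses below are invented for illustration:

```python
# Hypothetical consumer contract: field name -> expected type.
CONTRACT = {"orderId": str, "amount": float, "status": str}

def violations(response: dict) -> list[str]:
    """Return a list of contract violations found in a backend response."""
    problems = []
    for field, expected_type in CONTRACT.items():
        if field not in response:
            problems.append(f"missing field: {field}")
        elif not isinstance(response[field], expected_type):
            problems.append(f"{field}: expected {expected_type.__name__}, "
                            f"got {type(response[field]).__name__}")
    return problems

# The mock we test against in isolation always satisfies the contract ...
mock_response = {"orderId": "A-17", "amount": 99.5, "status": "OPEN"}
assert violations(mock_response) == []

# ... but a real backend may have drifted, e.g. renamed a field.
real_response = {"order_id": "A-17", "amount": 99.5, "status": "OPEN"}
print(violations(real_response))  # → ['missing field: orderId']
```

Running exactly this kind of assertion against the live backend before release is what separates a contract on paper from one that actually holds.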

Oliver Weise leads a team of software engineers at ConSol GmbH. His main topics right now all revolve around developing cloud-native applications: planning software architecture and infrastructure, as well as DevOps and test automation for the cloud.

We look forward to seeing you!