Category Archives: Track

This category contains all track presentations.

Testing as a Driver for Development Change

Abstract:

You know the feeling: you’ve been working as a testing manager for several years, you and your team have established a testing process and then worked on improving and refining it, but you’ve reached the point where you can’t achieve any more. It is not that your processes are perfect (they aren’t), but you realise that to make further improvements in the testing process you have to change the way development works, and that is a far harder task.

So how do you go about changing the development lifecycle? Key areas to address to make progress with testing are: documentation, volume of change, design for testing, quality assurance/quality control, and project management. Then you have the traditional problems of the relationship between the developers and the testers, the organisational priorities assigned to testing, and the commercial realities of a software house.

This presentation uses as a case study the work carried out by Graham over the last year, starting with identification of the problems and the kick-off of a change programme, and then details the initiatives that have resulted, including the definition of a test-friendly development lifecycle. To add to the complexity, the development group have been investigating agile methodologies such as XP.

Downloads:

PowerPoint      pdf

Presented at:

1. BCS SIGiST, London – May 2002
2. EuroSTAR, Edinburgh – Dec 2002

When to Compromise on Testing

Abstract:

Several times over the last few years of working as a tester, I have found myself making compromises in the way that I have been testing, and have generally felt very uncomfortable about doing so. Everyone will tell you that compromise in testing is inevitable, but that never makes it any easier. It is never possible to get the perfect mix of resources, skilled testers, equipment to test upon, enough time to plan and prepare for testing, or even to run all of the test scripts, let alone re-test all of the software fixes.

Managers are forever telling you that when they used to write and test software they did it this way, or that way; someone else will suggest that you aren’t using the right toolset; and even your own testing team may disagree with the general direction or method. Notwithstanding all of that, and the fact that developers don’t make mistakes (do they?), the users will then blame you personally for every bug that they find!

This talk does not offer a silver-bullet solution, but it will take you through the testing lifecycle, identifying the areas where compromise is most commonly called for, and showing you the techniques that I have found successful in managing and controlling that compromise without losing integrity. It also covers a few of the pitfalls!

Downloads:

PowerPoint

Presented at:

1. BCS SIGiST, London – Sep 2000
2. EuroSTAR, Copenhagen – Dec 2000

System Testing In A Hurry

Abstract:

What would you do if you were given this challenge?

Hi Graham, we have a project which finished development last Friday and starts system testing today (Monday). Unfortunately, all of our testers are preoccupied with Year 2000 projects and we can’t spare them, so we thought that we would ask the development team to carry out the system testing. Can you talk to the team for a couple of hours and tell them everything they need to do ‘System Testing In A Hurry’? . . . How long do you need to prepare? . . . You have one day!

This talk will focus on the presentation made to the development team that Tuesday, covering the essentials: planning, resources, testing approach, testing techniques, test preparation, test execution, fault management and progress reporting, detailing what to do and, more importantly, what not to do.

Additionally the talk includes feedback from the development team, a report on how the testing progressed and a rather surprising outcome!

Downloads:

PowerPoint

Presented at:

1. BCS SIGiST, London – Dec 1999

Time-Boxed Testing

Abstract:

There is great pressure upon developers today to improve productivity and effectiveness. To achieve this, there is a move away from the traditional structured methodologies towards more dynamic, iterative and RAD approaches.

This is being combined with object- and component-based techniques, and delivered with a new generation of IDEs, to produce thin-client, web-based, voice and data products.

Downloads:

PowerPoint      Film      Film      Film      Film
yes, 4 animations

Presented at:

1. BCS SIGiST, London – July 1998

Addendum:

When this presentation was first given back in 1998, the conference world was in transition from overhead foils to laptops and projectors. The presentation software of the time, in this case PowerPoint, struggled with the most basic of embedded objects. That is why the supporting animations are presented separately.
It is worth noting that although the abstract is quite brief, this presentation covered a large amount of conceptual ground, supported by real-life experience.

Practical Test Monitoring

Abstract:

Manager:   “Give me a testing status report!”
Tester:       “It feels OK . . .”
or              “It’s a bit dodgy . . .”
or              “We’ve raised a lot of problems . . .”
Manager:   “Be objective, quantify your answer, give me numbers, give me something to manage.”

As the Tester you are left feeling that you could do more! So how do you monitor testing without doing anything complex, arduous or tedious, but still give your Manager an objective and quantifiable report?

This talk aims to demonstrate simple, effective and practical ways to monitor testing, covering Test Planning, Test Scripting, Test Execution and Fault Reporting, by identifying what to monitor, when to monitor it and how to report it.

Using as a case study two similarly sized projects, the first an enhancement and the second a new build, this talk will show you how to monitor testing. This is not rocket science, but practical, pragmatic and effective reporting.
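The detail is in the slides below, but as a rough, hypothetical sketch of the quantified style of reporting the abstract argues for (script counts, pass rates and open faults rather than “it feels OK”), something as small as the following would answer the Manager’s request for numbers. The names and figures here are invented for illustration only and are not taken from the presentation.

# A minimal, hypothetical sketch (not from the talk) of turning daily test
# execution counts into an objective, quantified status line.

from dataclasses import dataclass

@dataclass
class TestProgress:
    planned: int      # scripts planned for the phase
    run: int          # scripts executed so far
    passed: int       # scripts that passed
    failed: int       # scripts that failed
    faults_open: int  # faults raised and not yet cleared

def status_line(day: str, p: TestProgress) -> str:
    """Summarise raw counts as the kind of report a manager can actually use."""
    pct_run = 100 * p.run / p.planned if p.planned else 0.0
    pct_pass = 100 * p.passed / p.run if p.run else 0.0
    return (f"{day}: {p.run}/{p.planned} scripts run ({pct_run:.0f}%), "
            f"{pct_pass:.0f}% pass rate, {p.failed} failed, "
            f"{p.faults_open} faults open")

# Illustrative figures only.
print(status_line("Mon", TestProgress(planned=120, run=45, passed=40, failed=5, faults_open=7)))
# -> Mon: 45/120 scripts run (38%), 89% pass rate, 5 failed, 7 faults open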

Downloads:

PowerPoint

Presented at:

1. BCS SIGiST, London – Sep 1997
2. DEVTest 98, London – Nov 1998

UAT The Hard Way

Abstract:

The aim of this talk is to relate direct experience of User Acceptance Testing to the SIGiST. Sound planning, structured testing, following the V model, introducing traceability throughout the lifecycle and gathering metrics are not enough. You actually have to do the testing and deliver as near to a fault-free product as possible.

Unfortunately, nothing ever goes smoothly, and this talk will identify where in User Acceptance Testing problems can occur, what they might be and how to resolve some of them.

So why “The Hard Way”?
Answer:    “Because if it was easy, someone would have already done it!”

Downloads:

PowerPoint

Presented at:

1. BCS SIGiST, London – May 1996


RAD GUI Testing

Abstract:

This presentation relates direct experience of testing GUI applications using a RAD development methodology. It discusses the differences between RAD and structured (Waterfall) development, outlines an approach to testing in a RAD environment, and highlights some of the pitfalls or lessons learnt.

Downloads:

PowerPoint

Presented at:

1. BCS SIGiST, London – 1995