Showing posts with label coaching. Show all posts

Thursday, 6 March 2014

I'm going to.....

So taking John's example I thought I'd share my plans for this year so far.




As far as I'm aware there are still tickets available for all of them, but get in quick.

NB. Tickets not needed for the #TesterGathering

Monday, 8 April 2013

Explaining testing: 101 Tactics For Revolutionaries

Here are the first 10 to get you started.

Onward to glory!

  1. if you’re in charge, do it yourself
  2. if you’re not in charge, do it yourself
  3. become known as “the guy who…” so when the time is right, everyone knows there’s a guy who…
  4. learn to be nice, so people like you
  5. realise there are no rules, you can do what you like
  6. know that you are as right as you can be for now given what you’ve learnt so far
  7. know that this is the same for everybody else
  8. stay on the inside of the wrong thing so you can speak with authority on why and how it is wrong
  9. know it’s not a race. That you can divide the world into those ahead of you and those behind, and to all those ahead of you, you’re the one behind.
  10. be an entrepreneur not a crusader
The rest are here: 101 Tactics For Revolutionaries

Wednesday, 27 March 2013

No time left at the end of the sprint for proper testing

In the Agile Testing LinkedIn group the following was posted:

No time left at the end of the sprint for proper testing

Designers tend to add and change code until the end of a sprint, not leaving enough time to do all the agreed testing. At the start of a sprint, we assign rough time estimates to user stories, taking both design and test activities into account. Some tests are automated and run during the night.
However, other tests need manual preparation of data and partly manual execution and result analysis. There is also some amount of exploratory testing involved. During the sprint, there always seems to be a reason not to deliver to test yet: fixes, improvements and design updates. At the end of the sprint, little time is left for manual testing, far too little for running the tests, analysing and fixing any bugs, retesting and logging results.

What advice do you have for me, so that I can claim and really use a fair amount of the sprint time for testing?

With a follow up post of:
What I called 'delivery' is not a heavyweight process wall. It is just an oral notification in the team stand-up meeting that some story is ready for test. Our way of working is pretty much in line with all the points you mention, except for point 5, "Testing is a fair amount of the sprint if done well". I think 1 day left for testing out of a 2-week sprint is not this 'fair' amount. The pattern that I have to cope with is: several user stories are coded in parallel and they tend to be 'ready for test' all at the same time, that is, 1 day before sprint end. The tester is involved in functionality and architectural discussions during the sprint and prepares test data and test scripts, ready to 'push the button' when a story is ready.

My (currently unpublished) comment (with minor changes):
So, based on the info you've provided, I'm going to make a bunch of suppositions and ask a number of questions.

1. Is there a definition of 'done'? Does it include testing? If so, it seems like it's being ignored.
* If it is being ignored are there retrospectives held? What happens when this is brought up?
* Is the issue being recognised by the rest of the team?
* Is it being recognised and cared about?
* Are the powers that be aware?

2. Are the stories broken up into tasks? If so is it possible to test the tasks?

3. If what you are working on is broken up into (small) stories then, setting aside the late adjustments, there should be a constant stream of stories coming through. If not, has this been looked at? If so, what was the outcome?

4. Is it possible for team members to pair? I.e. testers and devs, BAs and testers, BAs and devs, etc.

5. Is there a visual representation of story progress? Visible to everybody?

6. Is this way of working new to the team/company? Was there help making the transition? If there was, were they any good? Were any new people with more experience in this way of working hired?

7. Are you/the testers prepared to play hardball? You can't possibly test a sprint's worth of work in a day, so don't try.

8. How are the late adjustments getting into the story? They should be judged on value and, as a team, you should decide whether or not they get into the sprint. Failing that, a story or stories can be dropped to allow for the changes.

9. Is there a scrum master type role? Is he/she someone who has just gone and got the CSM, or are they experienced?
* Experience is very hard to judge, how is it done?

10. Is there a way to prepare test data through automation?

11. Are any skill sets lacking in the team in general?

It doesn't seem like you have a testing issue; you have a team/culture/mindset issue.
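On point 10 above, one common way to cut down the manual data preparation is to generate test data from a fixed seed so every nightly run starts from the same, reproducible state. This is only a minimal sketch of that idea; the `generate_customers` helper, the field names and the CSV format are all hypothetical, not anything from the original post.

```python
import csv
import io
import random


def generate_customers(count, seed=42):
    """Generate deterministic fake customer rows.

    Using a fixed seed means the same data comes out every run,
    so nobody has to prepare it by hand before testing starts.
    """
    rng = random.Random(seed)  # fixed seed => reproducible data
    rows = []
    for i in range(count):
        rows.append({
            "id": i + 1,
            "name": f"customer-{i + 1}",
            "balance": round(rng.uniform(0, 1000), 2),
        })
    return rows


def to_csv(rows):
    """Serialise the rows to CSV, ready to load into the system under test."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "name", "balance"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()


if __name__ == "__main__":
    print(to_csv(generate_customers(3)))
```

Hooked into whatever runs the nightly automated tests, something like this removes one of the "reasons not to deliver yet to test", because the data is always ready.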



I'd like to know what I've missed.
 


 

Wednesday, 6 February 2013

It doesn't make sense.


I stole this, I changed two words:

People work with one set of ideas about how the software is. Everything they do, be it experimental or theoretical work, is informed by, and framed within, that set of ideas. There will be some evidence that doesn't fit, however. At first, that evidence will be ignored or sabotaged. Eventually, though, the anomalies will pile up so high they simply cannot be ignored or sabotaged any longer. Then comes crisis.
13 Things That Don't Make Sense - Michael Brooks.

To me, this is a pretty good description of software development, though of course not in all cases.

It's also a pretty good reason why things like agile, devops, devs, bdd, etc have come about.

We do approach things with a set of ideas and we do frame things with that set of ideas in mind.

We stick to our own ideas, even though some of our ideas have been born out of others' ideas and thoughts and words and we've blindly made them our ideas and thoughts.
- For more on this train of thought refer to Leprechauns of Software Development or various kinds of certification.

When we have ideas that we have actually conceived it can be a good thing because we all have different experiences, we all have different thoughts, we can all add something.

I think the problems occur when we don't let go of these ideas (when it would be beneficial to do so), learn from others' experiences and listen to others' ideas.

A lot of the time we don't conceive ideas together for something we are supposed to be working on together.

What's wrong with us?

Doesn't make sense to me.

Make sense to you?

Continuing with the excerpts from 13 Things That Don't Make Sense, the next paragraph starts with the sentence:

Crisis, Kuhn said, is soon followed by the paradigm shift in which everyone gains a radically new way of looking at the world.

Does it? Not for software development, not as much as needed.

In the context of software development the sentence would read:

Crisis, Kuhn said, is soon followed by an attempt to throw more people at the problem, work longer hours to stem it, and follow the procedures that caused the crisis in the first place, until the next crisis arrives.

What's wrong with us?






Thursday, 10 January 2013

Weeknight Testing....BRB..

So a while back a bunch of clever people started this thing called Weekend Testing

About us
WT formerly known as Bangalore Weekend Testers is the acronym for Weekend Testers. We are a group of testers who have synergy towards testing software and learning from it. We also belong to the group of testers whose vision is to improve the craft. We are bringing Weekend Testing through our first chapter – Bangalore Weekend Testers, to find people with similar synergy.
Mission of WT
A platform for software testers to collaborate, test various kinds of software, foster hope, gain peer recognition, and be of value to the community.
You should already be aware of it, if not, look into it. Good times.
 
And out of it grew Weeknight Testing.

Weeknight Testing slowed down as we all got busy with life and there hasn't been a session for quite a while.

Sharath has been in contact and we're looking at reviving it.

If there is anything you would like covered or if you're interested in running a session let us know.

I think (tbc) we're going to be looking at running it in different ways; not sure exactly how yet, but I think we'll mix between in-person and online sessions.

Details on a couple of past sessions:

WNT – Black Box Security Testing

Week Night Testing: Requirements analysis & testing traps

Weeknight Testing #04 – an experience report

Agile Testing UK:Weeknight Testing Live 


  - Live video streamed between Germany, San Francisco and London.

Get involved.


Test. Learn. Contribute.


Cheers

Tony.

Tuesday, 21 February 2012

Testing? Thoughts? Idea? What?


Yesterday I tried an experiment I'd been toying with in my head.

I've been with an organisation for roughly 8 months now and I've not actually had a lot of time to spend with the Testers, as we've all been busy, and I wanted to know more about how they think and what they think about what they do.

My role has changed slightly now and I have more time to work with the Testers and so yesterday was the first of the 'sessions' I'll be running.

I had 3 Testers on the exercise and essentially just asked them to write down thoughts on testing.

We then discussed what they had written down and wrote it on a whiteboard.

I then added to it with my thoughts which we also discussed.

There were notes being taken and thoughtful nods and comments.

Mine are in red.

Francesco, a colleague who wasn't on the exercise later pointed out that we'd not written anything about 'who'.


What else did we miss?

I think the session was a success as it seemed to get the guys thinking and I learnt about their thinking.

I would like to punch it up a bit, but I'm not sure what I could add to jazz it up a little.

Have you run anything similar? Or taken part in something similar? How did you get on?

It might have worked a little better if thoughts had been written down the night before and we'd then got together to discuss them, as I'm thinking of new stuff to add all the time.