Yahoo no longer test...

Can't remember the last time I even THOUGHT about using Yahoo instead of Google.
 
He said "Can't remember the last time I even "THOUGHT" about using Yahoo instead of Google"

It helps if you "READ" and comprehend the words.

And if you "READ" my original post, you'll see I was talking about flickr.
But hey, why let comprehension get in the way of bitching?

LOL
 
And if you "READ" my original post, you'll see I was talking about flickr.
But hey, why let comprehension get in the way of bitching?

Hey, if you read my post you'll see I quoted Stumpy's post, not yours, so "do keep up!"

"apology accepted"
 
Hey, if you read my post you'll see I quoted Stumpy's post, not yours, so "do keep up!"

"apology accepted"
Perhaps it's you who needs to keep up; you actually quoted @Pookeyhead. Rather childishly, too.
 
Hey, if you read my post you'll see I quoted Stumpy's post, not yours, so "do keep up!"

"apology accepted"
Haha, NO apology proffered. The thread originally referenced Flickr and whilst Stumpy didn't reference Flickr, David was pointing out that Flickr is Yahoo.
Please keep up there under the bridge.
 
Hmm. They may have done away with their QA team and any "manual checking" by humans.
But what the article doesn't say is whether there are any automated tests. We advocate that our developers do test-driven development, whereby they write a unit test first and then write code that satisfies the test.
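The test-first loop described above can be sketched in a few lines of Python. The function `vat_inclusive` and the pence-based arithmetic are invented purely for illustration; they are not anything Yahoo or the poster actually uses:

```python
# Test-first sketch of the TDD cycle: the test exists before the code.
# `vat_inclusive` and the 20% default rate are illustrative assumptions.

def test_vat_inclusive():
    # Step 1: write the test first; it fails until the code below exists.
    assert vat_inclusive(10000) == 12000   # 100.00 net -> 120.00 gross
    assert vat_inclusive(0) == 0

# Step 2: write the minimal code that satisfies the test.
def vat_inclusive(net_pence: int, rate_percent: int = 20) -> int:
    """Gross price in pence; integer arithmetic avoids float rounding."""
    return net_pence * (100 + rate_percent) // 100
```

Run under pytest (or called directly), the test drives the implementation: red first, then green, then refactor.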
 
Haha, NO apology proffered. The thread originally referenced Flickr and whilst Stumpy didn't reference Flickr, David was pointing out that Flickr is Yahoo.
Please keep up there under the bridge.


Exactly. Stumpy was saying he never even thinks about using Yahoo. My point was, if he uses Flickr... he IS using Yahoo whether he likes it or not.

@AHILL What's with the aggression? Not getting any lately? Try sitting on your hand for a while.
 
Hmm. They may have done away with their QA team and any "manual checking" by humans.
But what the article doesn't say is whether there are any automated tests. We advocate that our developers do test-driven development, whereby they write a unit test first and then write code that satisfies the test.

TDD and BDD are software engineering practices, not testing. A system can pass all of its unit tests and still not be "good" or even functional.
 
Exactly. Stumpy was saying he never even thinks about using Yahoo. My point was, if he uses Flickr... he IS using Yahoo whether he likes it or not.

@AHILL What's with the aggression? Not getting any lately? Try sitting on your hand for a while.

The only aggression I am seeing is from yourself & your two hangers-on.

Also, what's with the derogatory remarks?
 
Hmm. They may have done away with their QA team and any "manual checking" by humans.
But what the article doesn't say is whether there are any automated tests. We advocate that our developers do test-driven development, whereby they write a unit test first and then write code that satisfies the test.

As @KitsuneAndy says, all that proves is that the lines of code written by the dev do what is expected in isolation, but not:
1. What the design wanted
2. If it integrates with the rest of the product
3. How it behaves outside the 'happy path'
4. How it performs under load
5. If there are any vulnerabilities
6. etc.

Have a look at "V-model testing", or maybe "Dual Vee", or if you're from the 80s, the "Waterfall approach to software testing".
You might want to look at Agile Scrums etc.

Yes, unit testing is better than nothing, but not by much.

The reason to do thorough testing early is that issues are cheaper to fix the earlier they are spotted.
If you find an issue during design testing, it is far cheaper to fix than one found in User Acceptance, because by then you will have built, unit tested, system tested, load tested, pen tested and handed the software to your business users; you can see how the cost adds up very quickly.

Mind you, none of this is needed if your developers are perfect ;)
Anyway, this is getting off topic.
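The "passes in isolation, fails in integration" point can be made concrete with a small sketch. Every name here is invented for the example; it is not code from any system mentioned in the thread:

```python
# Illustrative sketch of the gap between unit and integration testing.
# All names here are invented for the example.

def format_user(record: dict) -> str:
    # The developer assumed the record carries a "name" key.
    return record["name"].title()

def test_format_user_in_isolation():
    # The unit test passes because it feeds the function exactly the
    # data shape the developer expected...
    assert format_user({"name": "ada lovelace"}) == "Ada Lovelace"

# ...but suppose the real upstream service returns this shape instead;
# the first integrated run raises KeyError. Only an integration test
# (or a contract test) against the real producer catches the mismatch.
upstream_record = {"full_name": "ada lovelace"}
```

The unit test is green, yet the integrated system falls over on its first real input, which is exactly the limitation the list above describes.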
 
The only aggression I am seeing is from yourself & your two hangers-on.

Also, what's with the derogatory remarks?


You waded in from nowhere, quoted me, and said,
He said "Can't remember the last time I even "THOUGHT" about using Yahoo instead of Google" it helps if you "READ" and comprehend the words.
when I wasn't even talking to you... when you clearly misunderstood what was being said.

In what way was I being aggressive to you? Can you please point out any aggressive behaviour towards you in this thread, prior to you deciding to be rude to someone for no apparent reason?

Also, what's with the derogatory remarks?

Because you're being a dick.

That's right... you heard me, now run along to the mods and cry like the big girl you are.
 
As @KitsuneAndy says, all that proves is that the lines of code written by the dev do what is expected in isolation, but not:
1. What the design wanted
2. If it integrates with the rest of the product
3. How it behaves outside the 'happy path'
4. How it performs under load
5. If there are any vulnerabilities
6. etc.

Have a look at "V-model testing", or maybe "Dual Vee", or if you're from the 80s, the "Waterfall approach to software testing".
You might want to look at Agile Scrums etc.

Yes, unit testing is better than nothing, but not by much.

The reason to do thorough testing early is that issues are cheaper to fix the earlier they are spotted.
If you find an issue during design testing, it is far cheaper to fix than one found in User Acceptance, because by then you will have built, unit tested, system tested, load tested, pen tested and handed the software to your business users; you can see how the cost adds up very quickly.

Mind you, none of this is needed if your developers are perfect ;)
Anyway, this is getting off topic.
And you can automate most of those tests as well :thumbs:
 
And you can automate most of those tests as well :thumbs:

Most, but not all. As good as automation is these days, exploratory testing is still best left to people.

My day job is looking after the test automation at a big insurance company.
 
Most, but not all. As good as automation is these days, exploratory testing is still best left to people.

My day job is looking after the test automation at a big insurance company.
Cool. Then you undoubtedly agree that it all depends. It depends on so many different factors which way is best for an organisation, application, commercial arrangements, etc.
 
And you can automate most of those tests as well (y)
To be fair, I suggested
1. What the design wanted
2. If it integrates with the rest of the product
3. How it behaves outside the 'happy path'
4. How it performs under load
5. If there are any vulnerabilities

1. No - can't be automated; you have to read the design and check it matches the requirements.
2. Yes, to an extent - you'll want to regression test the app to ensure the changes haven't had any detrimental impact, but the first run through of the integration itself would probably need to be manual. That said, automate as you go along so you can add to the regression pack and re-run the tests on subsequent builds.
3. Probably not - you're looking for failures here by doing unusual things (such as resting a book on the enter key for 30 minutes to see what happens, or sitting on the keyboard (both things I've done)); document the tests, yes, but automate, probably not.
4. Yes - absolutely yes, there's no other real way. But be aware of the limitations, and also have someone manually test on the system whilst it is under load so you can judge how it is performing.
5. Partly - pen tests tend to be scripted, but it absolutely needs human intervention to decide what to attack next.

I do agree though, @dejongj, that how much / how little and in what way testing is undertaken is a risk-driven commercial decision. There is no way you would test an Apple App Store game the same way you would test a safety-critical tool.
I'm interested: what is your involvement in testing? (If you say you're a project manager, then the door is that way ----> ;) )
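The "automate as you go along and add to the regression pack" idea in point 2 can be sketched as a growing table of cases that gets re-run on every build. The function and the cases below are invented for illustration:

```python
# Hypothetical function under test; REGRESSION_CASES is the growing
# "regression pack": each bug found during integration or exploratory
# testing adds a row, which is then re-run on every subsequent build.

def normalise_postcode(raw: str) -> str:
    # Uppercase and collapse internal whitespace to single spaces.
    return " ".join(raw.upper().split())

REGRESSION_CASES = [
    ("sw1a 1aa", "SW1A 1AA"),      # original happy path
    ("  sw1a  1aa ", "SW1A 1AA"),  # whitespace bug found in integration
    ("", ""),                      # empty input found in exploratory testing
]

def run_regression_pack() -> None:
    for raw, expected in REGRESSION_CASES:
        assert normalise_postcode(raw) == expected, (raw, expected)
```

The pack costs almost nothing to extend, and each fixed bug stays fixed because its triggering input is replayed forever after.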
 
Cool. Then you undoubtedly agree that it all depends. It depends on so many different factors which way is best for an organisation, application, commercial arrangements, etc.

Yep, we have a huge number of systems covering all our legacy estate, and the approach on each one is different depending on who the user is, what technology it's on, and the risk appetite of the business involved.
 
To be fair, I suggested
1. What the design wanted
2. If it integrates with the rest of the product
3. How it behaves outside the 'happy path'
4. How it performs under load
5. If there are any vulnerabilities

1. No - can't be automated; you have to read the design and check it matches the requirements.
2. Yes, to an extent - you'll want to regression test the app to ensure the changes haven't had any detrimental impact, but the first run through of the integration itself would probably need to be manual. That said, automate as you go along so you can add to the regression pack and re-run the tests on subsequent builds.
3. Probably not - you're looking for failures here by doing unusual things (such as resting a book on the enter key for 30 minutes to see what happens, or sitting on the keyboard (both things I've done)); document the tests, yes, but automate, probably not.
4. Yes - absolutely yes, there's no other real way. But be aware of the limitations, and also have someone manually test on the system whilst it is under load so you can judge how it is performing.
5. Partly - pen tests tend to be scripted, but it absolutely needs human intervention to decide what to attack next.

I do agree though, @dejongj, that how much / how little and in what way testing is undertaken is a risk-driven commercial decision. There is no way you would test an Apple App Store game the same way you would test a safety-critical tool.
I'm interested: what is your involvement in testing? (If you say you're a project manager, then the door is that way ----> ;) )

That is a bit disrespectful; I wouldn't allow any such comments in my teams. All roles are very important in arriving at a quality product.

But as it is my name above the door, and I have to provide the explanations, I do make it my business to fully understand all aspects and associated risks.

Techniques have moved on a lot over time. Sometimes the investment in automation isn't worth it. Other times it is absolutely appropriate. It depends. I wouldn't be as resolute with the yes and no as you were above. Given enough time and resources, all of the points can be automated. Will it always be feasible? No. Will any form cover 100%? No, never. Does it actually matter? No, most of the time it doesn't.

As I said before: it all depends.
 
That is a bit disrespectful; I wouldn't allow any such comments in my teams. All roles are very important in arriving at a quality product.
What is disrespectful? Asking about experience? Or the fact that I cracked a little joke and added a smiley?
But as it is my name above the door, and I have to provide the explanations, I do make it my business to fully understand all aspects and associated risks.

Techniques have moved on a lot over time. Sometimes the investment in automation isn't worth it. Other times it is absolutely appropriate. It depends. I wouldn't be as resolute with the yes and no as you were above. Given enough time and resources, all of the points can be automated. Will it always be feasible? No. Will any form cover 100%? No, never. Does it actually matter? No, most of the time it doesn't.

As I said before: it all depends.
Yes and no, to degrees.
I agree, given enough time and effort you could create automation for a lot of the examples I gave, though whether it's worth it and how much coverage it would provide we can only guess at (especially as we are talking hypothetically).
I'm struggling to see how requirements / design testing can be automated, as at the point of defining requirements you don't have any software to test.

Regardless, we're talking semantics here; I think we're all agreed that testing to a level appropriate to the risk (whether that be financial risk, reputational risk, regulatory risk or even risk to life) is a good idea.

The article states that:
The effort was part of a program Yahoo calls Warp Drive: a shift from batch releases of code to a system of continuous delivery. Software engineers at Yahoo are no longer permitted to hand off their completed code to another team for cross checking. Instead, the code goes live as-is; if it has problems, it will fail and shut down systems, directly affecting Yahoo’s customers.
Presumably Yahoo have weighed the cost against the risk and decided that if code fails and shuts down systems, that's something they can live with (and presumably the users can go to hell). As a user of Flickr I would prefer to see Yahoo go from success to success so that they remain financially stable, and I wish them good fortune with their approach.
 
That is a bit disrespectful; I wouldn't allow any such comments in my teams. All roles are very important in arriving at a quality product.

But as it is my name above the door, and I have to provide the explanations, I do make it my business to fully understand all aspects and associated risks.

Techniques have moved on a lot over time. Sometimes the investment in automation isn't worth it. Other times it is absolutely appropriate. It depends. I wouldn't be as resolute with the yes and no as you were above. Given enough time and resources, all of the points can be automated. Will it always be feasible? No. Will any form cover 100%? No, never. Does it actually matter? No, most of the time it doesn't.

As I said before: it all depends.

Volume and the number of times a test is going to be executed are generally the key factors.

We have several frameworks in place depending on the technology involved, what type of application it is, how critical the application is and the risk appetite of the business unit or partner.

Fully automated build, deployment and testing on some systems. BDD / TDD where applicable, usually then combined with a data driven framework to do the end to end functional and regression testing. We can't always deploy straight to live depending on what we're changing due to legal constraints and having to satisfy compliance, but we have the capability to do so. By leveraging the different frameworks and different styles of automation we've gone from 4 major releases to live a year, to monthly releases on our biggest applications and can deploy UI/UX only changes instantly.
 
We do thirty to forty deployments per day. We have no trouble with our SLA of 99.9%.
Our software performs calculations which have to be correct. Automated unit tests take care of that. We don't stub or mock anything.
Local development environments are barely distinguishable from production, so code performs the same way on a developer workstation as it does in a live environment.

I say this as an old-fashioned Waterfall QA: manual testing just gets in the way*.
By keeping it simple, we can do more.


*That's not to say people don't take ownership of their products and check them once they're in production. It's just that the formal part of the process has been dispensed with.
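The "no stubs, no mocks" style described above works best when the logic under test is a pure calculation, so the test exercises the real code directly. This is a sketch only; `pro_rata_refund` is an invented example, not the poster's actual system:

```python
# Sketch of a no-stub, no-mock unit test: the function is a pure
# calculation, so the test runs the real code with real inputs.
# `pro_rata_refund` is an invented example for illustration.

def pro_rata_refund(premium_pence: int, days_remaining: int,
                    days_total: int = 365) -> int:
    """Refund proportional to unused cover, rounded down to whole pence."""
    return premium_pence * days_remaining // days_total

def test_pro_rata_refund():
    # Real arithmetic end to end; nothing stubbed or mocked.
    assert pro_rata_refund(36500, 100) == 10000
    assert pro_rata_refund(36500, 0) == 0
    assert pro_rata_refund(36500, 365) == 36500
```

Because there are no test doubles, a green test here really does mean the production calculation is correct, which is what makes the approach viable without a separate QA stage.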
 
We do thirty to forty deployments per day. We have no trouble with our SLA of 99.9%.
Our software performs calculations which have to be correct. Automated unit tests take care of that. We don't stub or mock anything.
Local development environments are barely distinguishable from production, so code performs the same way on a developer workstation as it does in a live environment.

I say this as an old-fashioned Waterfall QA: manual testing just gets in the way*.
By keeping it simple, we can do more.


*That's not to say people don't take ownership of their products and check them once they're in production. It's just that the formal part of the process has been dispensed with.

That all depends though; our environments are huge and need to be stress tested thoroughly. Having scalable environments throughout our whole development cycle would be amazing, but it would cost a fortune.
 
As @KitsuneAndy says, all that proves is that the lines of code written by the dev do what is expected in isolation, but not:
1. What the design wanted
2. If it integrates with the rest of the product
3. How it behaves outside the 'happy path'
4. How it performs under load
5. If there are any vulnerabilities
6. etc.

Have a look at "V-model testing", or maybe "Dual Vee", or if you're from the 80s, the "Waterfall approach to software testing".
You might want to look at Agile Scrums etc.

Yes, unit testing is better than nothing, but not by much.

The reason to do thorough testing early is that issues are cheaper to fix the earlier they are spotted.
If you find an issue during design testing, it is far cheaper to fix than one found in User Acceptance, because by then you will have built, unit tested, system tested, load tested, pen tested and handed the software to your business users; you can see how the cost adds up very quickly.

Mind you, none of this is needed if your developers are perfect ;)
Anyway, this is getting off topic.


To be honest, it sounds like you are stuck in the mid-nineties.
Requirements gathering sessions lasting days on end.
Functional design.
Technical design.
Peer reviews.
Technical reviews.
Business specified test points.
High level test plans.
Low level test plans.
Packaging
Building
Integration Testing
Testing
Clone testing
Crazy big bang deployments

Typically, twelve months of work entailed just two weeks of actual coding.
Were the defects eliminated? No. The net result was very expensive software; only the most severe defects were fixed, because fixing them was so expensive.

Contrast that with a shorter process where changes go from backlog to production in less than a week. We produce quality software much more efficiently, and defects are easily fixed. It's no panacea, but in fifteen years of working in the industry it's fairly close.
 
That all depends though; our environments are huge and need to be stress tested thoroughly. Having scalable environments throughout our whole development cycle would be amazing, but it would cost a fortune.

In that case, you're doing it wrong :) HTH

In all seriousness, I have worked in places where there have been 40 discrete systems each concerned with millions upon millions of records. Ridiculously over complicated, unwieldy and complex. There is a lot to be said for actively finding cheaper and simpler ways of doing things. It's the only way to get better at handling complexity.
 
In that case, you're doing it wrong :) HTH

In all seriousness, I have worked in places where there have been 40 discrete systems each concerned with millions upon millions of records. Ridiculously over complicated, unwieldy and complex. There is a lot to be said for actively finding cheaper and simpler ways of doing things. It's the only way to get better at handling complexity.

It all depends, as previously said. Green-field developments are obviously a lot quicker, but we have a lot of legacy systems due to mergers over the years, a lot of legislative requirements to meet in the finance world, and hundreds of systems, millions of customers and billions of records spread across businesses on 3 continents, each with their own infosec laws and regulations to meet.

As legacy systems fall out of use, our development cycles speed up. But there's not a single big bank or finance company doing "true" agile; it's just not possible with all of the external constraints we have to work with. That said, 95% of our projects are now agile for the most part, but we have such a large infrastructure that we need to make sure the integration and regression testing in fully integrated environments is done properly, and having full-size environments within development just isn't feasible unless I convince one of the directors to buy us a new datacentre :D
 
This is going completely OT ...
Without giving out too much detail, we've had to deal with the challenges of legacy code and legacy infrastructure - our company was spun off from a finance company.

It's all perfectly feasible. We have two DCs. They are essentially development environments scaled up to handle the load. We use an n-tier infrastructure and the difference is that each tier is larger, but otherwise the same.
We do provide a separate data tier for our development environment - it's probably IRO 16U.
 
This thread has made me chuckle. A mate of mine works for a leading online hotel booking company, and we were out for a beer discussing work when he announced that his company do no testing (he didn't quantify that); they just monitor complaints and then fix things on the fly. My jaw dropped, given I am a test manager in the financial services industry. I've also done agile testing, which relies heavily on automation. I've never worked in the aeronautics industry, but I have heard about the level of testing they do for mission-critical systems. Let's face it, I wouldn't like to fly on an A380 if Airbus took the hotel booking company's view of testing.

It's a matter of horses for courses, depending on the criticality of the system and the type of tin and wire it sits on. Basic principles haven't changed much over 20 years, but techniques and technology have. Just my thoughts.
 