Sunday, November 29, 2009

What makes a good software tester?

Software testers are known for their creativity, but at the same time they have a reputation for being sadistic and for handling dull or repetitive work. They share many qualities with developers, yet there are important differences that set the two groups apart.

The qualities that, from my point of view, define a good software tester are:

1. Know programming.

This can be debated, as most would assume that test teams can be staffed with people who have no technical or programming knowledge. Even though that is a common approach, it is not recommended.

Here are a few reasons why.

Firstly, software testers test software. Without a technical background, they can't have real insight into the kinds of bugs an application is likely to contain or the likeliest places to look for them. As we all agree, testers never have enough time to test an application completely, so they must compromise between available resources and thoroughness. Testers must therefore optimize scarce resources, which means focusing on where bugs are most likely to be found.

Secondly, testing methods are tool- and technology-intensive. Testing tools, like the products under test, are built using technical and programming knowledge. Without that knowledge, testers are incapable of using most test techniques and are restricted to ad-hoc approaches.

This doesn't mean that testers must have formal programming training or have worked as programmers, but an understanding of programming logic and some hands-on experience are the easiest way to meet the "know programming" requirement.

2. Know the software application.

This is the other side of the knowledge coin. The ideal tester should have insight into how users will interact with and exploit the application's features, and into the kinds of errors users are most likely to make. In reality, testers cannot know both the application under test and the programming side in full depth; they strike a compromise between knowledge of the application and knowledge of its technical architecture.

For example, when testing demand generation software, testers should know how marketers would use the product to automate lead generation and calculate ROI; when testing online retail order software, testers should know how users could exploit weaknesses in online security.

3. Practice Intelligence.

Much research has been conducted to determine what makes an ideal tester, and the common conclusion is that there is no single benchmark for predicting one. Good testers are smart people; the single most important quality of an ideal tester is raw intelligence.

4. Be Hyper-Sensitive to little things.

Good testers notice little things that others (including programmers) miss or ignore. Testers see symptoms, not just bugs. We know that a given bug can have many different symptoms, ranging from trivial to catastrophic. We also know that the severity of a symptom is not necessarily related to the severity of its cause. Consequently, there is no such thing as a minor symptom, because a symptom isn't a bug; it is only after the symptom is fully explained that we have the right to say whether the bug that caused it is minor or major. Therefore, anything at all out of the ordinary is worth pursuing.

For example, the screen flickered this time but not last time: that's a bug. The generated report is off by 0.01%: a great bug. Good testers notice such little things and use them as an entree to finding a closely related set of inputs that will cause a catastrophic failure and therefore get the programmers' attention.
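To make the "off by 0.01%" idea concrete, here is a minimal sketch (in TypeScript, with made-up names and values) of how a tiny discrepancy in a report total can be the visible symptom of a real accumulation defect rather than harmless noise:

```typescript
// Hypothetical illustration: a report total that is off by a tiny amount
// because of naive floating-point accumulation. Names and values are made up.
const lineItems: number[] = Array.from({ length: 10000 }, () => 0.1);

// Summing 10,000 line items of 0.1 each should give exactly 1000,
// but naive accumulation drifts slightly away from the exact total.
const reportTotal = lineItems.reduce((sum, value) => sum + value, 0);
const expectedTotal = 1000;

console.log(reportTotal);                           // close to 1000, but not exact
console.log(Math.abs(reportTotal - expectedTotal)); // tiny, yet not zero

// Shrugging this off as "close enough" hides the underlying cause;
// the same accumulation error grows with larger reports.
```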

5. Be Tolerant of Chaos.

People react to chaos and uncertainty in different ways. Some cave in and give up, while others try to create order out of chaos. If testers wait for all issues to be fully resolved before starting test design or testing, they won't get started until after the software has shipped. Testers have to be flexible, able to drop things when blocked and move on to something that isn't blocked. Testers always have many unfinished tasks during the SDLC. In this respect, good testers differ from programmers: the testers' world is inherently more chaotic than the programmers'.

6. Practice People Skills.

Here's another area in which testers and programmers can differ. You can be an effective programmer even if you are hostile and antisocial; that won't work for a tester. Testers may take a lot of abuse from outraged programmers, so a sense of humor and a thick skin will help them survive. Testers may have to be diplomatic when confronting a senior programmer with a fundamental goof. Diplomacy, tact and a ready smile all work to the independent tester's advantage.

7. Tenacity.

An ability to reach compromise and consensus can come at the expense of tenacity; that's the other side of people skills. Being socially smart and diplomatic doesn't mean being indecisive or a limp rag that anyone can walk over. The best testers are both: socially adept and tenacious where it matters. Good testers can't be intimidated, even by someone pulling rank. They'll need high-level backing, of course, before they sign off on the quality of the software.

8. Be Organized.

I can't imagine a scatterbrained tester. There's just too much to keep track of to trust it all to memory. Good testers use files, databases and all the other tools of an organized mind. They make checklists to keep themselves on track. They recognize that they too can make mistakes, so they double-check their findings. They have the facts and figures to support their position. When they claim there's a bug, believe it, because if the developers don't, the tester will flood them with well-organized, overwhelming evidence.

9. Be Skeptical.

That doesn't mean hostile, though. I mean skepticism in the sense that nothing is taken for granted and everything is fit to be questioned. Only tangible evidence in documents, specifications, code, and test results matters. Good testers may patiently listen to reassuring, comfortable words from the programmers ("Trust me. I know where the bugs are."), and do it with a smile, but they will ignore all such unsubstantiated assurances.

10. Be Self-Sufficient and Tough.

If testers need love, they don't expect to get it on the job. They can't look to their interactions with programmers as a source of ego gratification or nurturing. Their egos are gratified by finding bugs, with few misgivings about the pain (in the programmers) that such findings might engender. In this respect, they must practice very tough love.

11. Be Cunning.

Systematic test techniques such as syntax testing and automatic test generators have reduced the need for such cunning, but the need is still with us and undoubtedly always will be, because it will never be possible to systematize all aspects of testing. There will always be room for the offbeat kind of thinking that leads to a test case that exposes a really bad bug. But this can be taken to extremes, and it is certainly not a substitute for the use of systematic test techniques. The cunning comes into play after all the automatically generated "sadistic" tests have been executed.

12. Be Technology Hungry.

Good testers hate dull, repetitive work. They'll do it for a while if they have to, but not for long. The silliest thing a human can do, in their mind, is pound on a keyboard while surrounded by computers. They have a clear notion of how error-prone manual testing is, and in order to improve the quality of their own work, they'll find ways to eliminate such error-prone procedures.

I've yet to meet a tester who wasn't hungry for applicable technology. When asked why they didn't automate such and such, the answer was never "I like to do it by hand." It was always one of the following:
(1) "I didn't know that it could be automated"
(2) "I didn't know that such tools existed"
(3) or worst of all, "Management wouldn't give me the time to learn how to use the tool."
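As a hedged illustration of the kind of small automation a technology-hungry tester reaches for, here is a sketch that replaces a repetitive manual smoke check with a short script; the URLs are placeholders, and the global fetch API is an assumption (it is available in modern browsers and recent Node.js versions):

```typescript
// Hedged sketch: replacing a repetitive manual check (loading a list of
// pages and eyeballing the result) with a short script. The URLs below
// are placeholders, not real endpoints.
const pagesToCheck: string[] = [
  "https://example.com/login",
  "https://example.com/search",
  "https://example.com/checkout",
];

async function smokeCheck(urls: string[]): Promise<void> {
  for (const url of urls) {
    const response = await fetch(url);
    const ok = response.status === 200;
    console.log(`${ok ? "PASS" : "FAIL"} ${response.status} ${url}`);
  }
}

smokeCheck(pagesToCheck).catch((err) => console.error("check aborted:", err));
```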

13. Be Honest.

Testers are fundamentally honest and incorruptible. They'll compromise if they have to, but they'll righteously agonize over it. This fundamental honesty extends to a brutally realistic understanding of their own limitations as human beings. They accept that they are no better and no worse, and therefore no less error-prone, than their programming counterparts. So they apply the same kind of self-assessment procedures that good programmers do: they hold test inspections just as programmers hold code inspections. The greatest possible crime in a tester's eyes is to fake test results.



Reference: onestoptesting.com

Tuesday, November 24, 2009

CAPABILITY MATURITY MODEL (CMM)

CMM describes software process management maturity relative to five levels:

i.e., Initial, Repeatable, Defined, Managed, and Optimizing.

At the Initial level there is a lack of planning and no clear-cut guide that software development teams can follow. Few details of a software process have been defined at this level, and good results are considered miraculous.

The second level, the CMM Repeatable process, is characterized by a commitment to discipline in carrying out a software development project. It is achieved through: requirements management, software project planning, software project tracking and oversight, software subcontract management, software quality assurance, and software configuration management.

The third level, the CMM Defined process, guides the structuring and evaluation of a software project. It is achieved through: organizational process focus and definition, a training program, software product engineering, intergroup coordination, and peer reviews.

The fourth level, the CMM Managed process, focuses on data gathering and analysis and on managing software quality. It is achieved through: quantitative process management and software quality management.

The fifth level, the CMM Optimizing process, is associated with defect prevention, automation of the software process wherever possible, and methods for improving software quality and team productivity while shortening development time.

Tuesday, November 17, 2009

“Testing” and “Quality Assurance”


Every company has its own functional and organizational uses for the terms "Testing" and "Quality Assurance". To be fair, the actions that are taken are more important than what they are called. It is common for testing activities to be subsets of a larger Quality Assurance life cycle.

While Quality Assurance sets out the framework for the implementation of quality in the development and implementation of information technology projects, it is Testing that identifies the impact of Quality Assurance, or the lack of impact, prior to implementation. It is Testing all through the project life cycle that quickly identifies defects, allowing for appropriate timely corrective action.

Not long ago, ad hoc testing was believed to be sufficient and the most junior staff were assigned to test code. With the increased complexity of today’s systems and shorter timeframes to deliver results, the industry no longer believes that to be true. Today’s accepted approach is more formal, evaluating all requirements, considering the associated risks, and then creating a test plan that satisfies not only the project team but the business sponsor. These items of requirement evaluation, risk assessment, and test plan creation are all Testing components of a Quality Assurance Life Cycle.

Testing activities should start at the project kickoff meeting with an early assessment of testing needs. Technical requirements to be tested, such as system performance, must be understood to allow the design to meet those technical requirements, to allow the infrastructure to be sufficiently robust, and to allow test cases to be built to challenge the requirements. Business requirements need to be clearly stated in a manner that makes them “testable.” All constraints need to be stated so that test cases can be formed to examine the results of hitting and exceeding those constraints.
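As a hedged example of what making a requirement "testable" can look like in practice, here is a minimal sketch that turns a stated performance constraint into a direct check; processOrder and the 2-second budget are hypothetical placeholders, not details from the article:

```typescript
// Hedged sketch: turning a stated constraint ("the operation must complete
// within 2 seconds") into a directly testable check. processOrder and the
// budget value are made up for illustration.
import { strict as assert } from "node:assert";

async function processOrder(): Promise<void> {
  // Stand-in for the real operation under test.
  await new Promise((resolve) => setTimeout(resolve, 150));
}

async function checkResponseTimeBudget(): Promise<void> {
  const budgetMs = 2000;
  const start = Date.now();
  await processOrder();
  const elapsedMs = Date.now() - start;
  assert.ok(elapsedMs <= budgetMs, `took ${elapsedMs} ms, budget is ${budgetMs} ms`);
  console.log(`within budget: ${elapsedMs} ms <= ${budgetMs} ms`);
}

checkResponseTimeBudget().catch((err) => console.error((err as Error).message));
```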

Testing at all phases of the project is critical. It is common knowledge that defects found and corrected early in the life cycle cost the organization far less to correct than those found later on. It also takes less time to correct a defect early in the project than it does as the project progresses. It only makes sense to have Testing be part of the project process right from the beginning.

In order to have the skills to design and execute a formal test plan, formal education and experience is required. The International Software Testing Qualification Board (ISTQB) provides an internationally acknowledged progressive set of examinations to prepare testers and test managers to meet the demands of today’s projects.

If there is one true statement in the Information Technology field, it is that we must always be learning and improving. Testing is no exception and is now a more respected and important part of the industry. A strong test organization made up of qualified staff has become a requirement for a successful Quality Assurance initiative and for project success.


Source: www.cstb.ca

Thursday, November 12, 2009

Firebug as a debugging tool

Things that are inevitable in software development are bugs, bugs, and more bugs, and JavaScript is no exception to this rule.

QA testers and developers need tools to help them locate these bugs, and one such tool is Firebug. In a very short time it has become popular among testers/QA and in the JavaScript development community. Firebug is a free, open-source project that works as a plug-in for the Firefox browser. The tool was created by Joe Hewitt, one of the creators of the Firefox browser.

Firebug is a debugger in the traditional sense. It lets you pause program execution, step through code line by line, and access the state of variables at any time. It can also examine the entire DOM (HTML and CSS), view built-in browser details, and easily inspect anything on the page simply by clicking on it. It also includes a powerful tool to monitor network traffic. All these goodies are packed into a compact interface.
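As a minimal sketch of the kind of code you might step through, here is a made-up TypeScript function (compiled to plain JavaScript for the browser) that uses the standard debugger statement and the console API, both of which Firebug picks up:

```typescript
// Made-up example of code you might step through in Firebug's Script panel.
// calculateDiscount and the sample values are purely illustrative.
function calculateDiscount(subtotal: number, couponPercent: number): number {
  // Execution pauses here when a script debugger (such as Firebug) is open.
  debugger;
  const discount = subtotal * (couponPercent / 100);
  // This output appears in the Console panel.
  console.log("subtotal:", subtotal, "discount:", discount);
  return subtotal - discount;
}

console.log("total to charge:", calculateDiscount(80, 15));
```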

One of the features that amazes me is that it can identify every single object on a web page, along with useful information such as each object's size and the time it took to load. It also displays the objects' URLs and previews.

Firebug not only lets you find and fix bugs in your code, it is also a suitable tool for exploring web applications. It can help you discover how other development teams' JavaScript works. Exploring other people's applications can be a powerful educational experience.




Source: getfirebug.com

Tuesday, November 10, 2009

Why should entry-level programmers not be placed in a test organization?

I don't like the idea of taking entry-level programmers and putting them into a test organization because:

(1) Loser Image.

Few universities offer undergraduate training in testing beyond "Be sure to test thoroughly." Entry-level people expect to get a job as a programmer, and if they're offered a job in a test group, they'll often look upon it as a failure on their part: they believe they didn't have what it takes to be a programmer in that organization. This unfortunate perception exists even in organizations that value testers highly.

(2) Credibility With Programmers.

Independent testers often have to deal with programmers far more senior than themselves. Unless they've been through a co-op program as undergraduates, all their programming experience is with academic toys: the novice often has no real idea of what programming in a professional, cooperative environment is all about. As such, they have no credibility with their programming counterparts, who can shrug off their concerns with "Look, kid. You just don't understand how programming is done here, or anywhere else, for that matter." This sets up the novice tester for failure.

Monday, November 2, 2009

Top 10 strategic Technologies for 2010

Ever wondered which strategic technologies have the potential to significantly impact an organisation's growth in the next three years? These are the technologies that would be at the forefront of an organisation's long-term plans, programmes and initiatives.

Research firm Gartner has listed top 10 strategic technologies that will help organisations transform and grow.

However, according to David Cearley, vice president and distinguished analyst at Gartner, "This does not necessarily mean adoption and investment in all of the technologies. They should determine which technologies will help and transform their individual business initiatives.”

Here are the top 10 strategic technologies for 2010:

Cloud computing

Cloud computing is a style of computing that characterizes a model in which providers deliver a variety of IT-enabled capabilities to consumers. Cloud-based services can be exploited in a variety of ways to develop an application or a solution. Using cloud resources does not eliminate the costs of IT solutions, but it does rearrange some and reduce others. In addition, enterprises consuming cloud services will increasingly act as cloud providers themselves, delivering application, information or business process services to customers and business partners.

Advanced analytics
Optimization and simulation use analytical tools and models to maximize business process and decision effectiveness by examining alternative outcomes and scenarios before, during and after process implementation and execution. This can be viewed as a third step in supporting operational business decisions. Fixed rules and prepared policies gave way to more informed decisions powered by the right information delivered at the right time, whether through customer relationship management (CRM), enterprise resource planning (ERP) or other applications.

The new step is to provide simulation, prediction, optimization and other analytics, not simply information, to empower even more decision flexibility at the time and place of every business process action. The new step looks into the future, predicting what can or will happen.

Client computing
Virtualization is bringing new ways of packaging client computing applications and capabilities. As a result, the choice of a particular PC hardware platform, and eventually the OS platform, becomes less critical. Enterprises should proactively build a five- to eight-year strategic client computing roadmap outlining an approach to device standards, ownership and support; operating system and application selection, deployment and update; and management and security plans to manage diversity.

IT for green
IT can enable many green initiatives. The use of IT, particularly among the white collar staff, can greatly enhance an enterprise’s green credentials.
Common green initiatives include the use of e-documents, reducing travel and teleworking. IT can also provide the analytic tools that others in the enterprise may use to reduce energy consumption in the transportation of goods or other carbon management activities.

Reshaping the data center
In the past, the design principles for data centers were simple: figure out what you have, estimate growth for 15 to 20 years, then build to suit. Newly built data centers often opened with huge areas of white floor space, fully powered and backed by an uninterruptible power supply (UPS), water- and air-cooled, and mostly empty.

However, costs are actually lower if enterprises adopt a pod-based approach to data center construction and expansion. If 9,000 square feet is expected to be needed during the life of a data center, then design the site to support it, but only build what’s needed for five to seven years. Cutting operating expenses, which are a nontrivial part of the overall IT spend for most clients, frees up money to apply to other projects or investments either in IT or in the business itself.

Flash memory

Flash memory is not new, but it is moving up to a new tier in the storage echelon. Flash memory is a semiconductor memory device, familiar from its use in USB memory sticks and digital camera cards. It is much faster than rotating disk but considerably more expensive, though this differential is shrinking. At the current rate of price decline, the technology will enjoy more than a 100 percent compound annual growth rate over the next few years and become strategic in many IT areas, including consumer devices, entertainment equipment and other embedded IT systems.

In addition, it offers a new layer of the storage hierarchy in servers and client computers that has key advantages including space, heat, performance and ruggedness.

Mobile applications
By year-end 2010, 1.2 billion people will carry handsets capable of rich mobile commerce, providing a fertile environment for the convergence of mobility and the Web. There are already many thousands of applications for platforms such as the Apple iPhone, in spite of the limited market and the need for unique coding. It may take a newer version that is designed to operate flexibly on both full PC and miniature systems, but if the operating system interface and processor architecture were identical, that enabling factor would create a huge upturn in mobile application availability.

Courtesy: OnestopTesting.com


References: Some of the content may refer to various sources available on the web.
Logos, images and trademarks are the properties of their respective organizations.