I work at a startup, where I am the only tester. I am testing a web app. My first approach is as follows:

Testing the core functionality

User experience and client experience

Multiple-browser testing

Checking breadcrumbs

Pagination testing

Valid and invalid data entries

Console errors and log errors

What other factors should I be manually testing?
Note: this is not a security-, performance-, or API-driven application.

I want to become more of an expert in testing. A senior tester told me he can find hidden bugs, and that he can find more than 100 bugs within 15 minutes. I want to have testing skills like that.
How do I improve my testing skills to that level?

3 Answers

a senior tester can find more than 100 bugs within 15 minutes. I want to be able to have test skills like that.

No, don't. Finding lots of bugs in a short time probably means you are reporting a lot of trivial bugs, and those waste everyone's time across the whole process.

The (agile) testing manifesto suggests focusing on building the best system over breaking the system. I can break an application in ways that I can report as defects, but most often these are routes real users will never take. Why should I report defects users will probably never run into? Test whether users will be happy with the system instead.

It's like telling a doctor, "It hurts when I push here," and the doctor replying, "Then don't push there." The same applies to testing.

I've often been the sole tester at a startup, testing a web application. I loved it, but it also required a lot of focus on what's important for a given release.

Learning to be a better tester is a very long, open-ended discussion, but let me try to summarize some things that might help you now:

Your testing strategy (how you organize your testing) will be dictated by each release and the potential failures (or risks) for that release. Each release is likely to be different: for one you might focus on performance testing, for another on functionality or scenario testing, and yet another might be about bug fixes, so you'll focus on regression testing.

Good testing is about finding information. What is your information objective for your release? Talk to people about the release: what are we most worried about in this release? Then search for information related to that objective.

Test documentation should be lean. It should help you to test but not become so difficult to maintain that it takes time away from testing.

Automation can help improve your efficiency. Not just with functional testing (regression testing) but also with creating test data, etc. Your programmers can definitely help you get comfortable with this.
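As one illustration of automating test-data creation, here is a minimal Python sketch that generates boundary-value strings for a length-constrained input field. The field and its 3–20 character limit are hypothetical examples, not from the original post:

```python
import random
import string

def boundary_values(min_len, max_len):
    """Return string lengths worth testing around a length constraint:
    just below, at, and just above each boundary."""
    return sorted({max(min_len - 1, 0), min_len, min_len + 1,
                   max_len - 1, max_len, max_len + 1})

def make_test_strings(min_len, max_len):
    """Generate one random string per boundary length, tagged valid/invalid."""
    cases = []
    for n in boundary_values(min_len, max_len):
        s = "".join(random.choices(string.ascii_letters, k=n))
        cases.append({"value": s, "length": n,
                      "expected_valid": min_len <= n <= max_len})
    return cases

# Hypothetical example: a username field that must be 3-20 characters.
for case in make_test_strings(3, 20):
    print(case["length"], case["expected_valid"])
# → 2 False, 3 True, 4 True, 19 True, 20 True, 21 False
```

The same idea extends to numeric ranges, dates, and file sizes; generating the cases programmatically keeps your valid/invalid entry testing consistent from release to release.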

In general when testing you can consider the following things:

Coverage. How much of something has my testing covered?

Activities. How will you test things? Will you use performance testing, regression testing, scenario testing, mobile testing, security testing, etc.?

Risks. Why are you testing? What failures are you looking for?

Testers. Who does the testing? Subject-matter experts? Beta users?

These four areas are important in helping you manage what you do and who does it. Since you are the sole tester, knowing when someone else can help out (for example, a business person testing certain features as a subject-matter expert or as the person who requested the feature) might free you up for more important or specific testing.

Testing is always about trade-offs. We don't have time to test everything, so which important things should we focus on in the time we have right now?
Focus on the higher-risk items first, and the lower-risk items afterwards.

Use-case testing (the client/user focus): results should match up with the original specifications provided by the client/user (these could be tickets, requirements, or anything else that defines what should exist, in a workflow-like format).

Usability testing: including Lorem Ipsum (http://generator.lorem-ipsum.info/) in text areas to check readability; Section 508 testing for accessibility; color schemes; ease of use of controls; page-load times and user perception (a partial page load shows something visually sooner if there is an issue); ease of understanding; help materials; navigation; pagination; etc.

Data-integrity testing, which involves inputs and reads of specific data as well as data manipulation within the application.
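Data-integrity testing often boils down to round-trip checks: write a value, read it back, and verify it survived unchanged. Here is a minimal Python sketch using an in-memory SQLite table as a stand-in for the application's data store; the table, column, and sample inputs are all hypothetical:

```python
import sqlite3

def round_trip_check(conn, value):
    """Write a value, read it back, and verify it survived unchanged."""
    conn.execute("INSERT INTO notes (body) VALUES (?)", (value,))
    (stored,) = conn.execute(
        "SELECT body FROM notes ORDER BY rowid DESC LIMIT 1").fetchone()
    return stored == value

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (body TEXT)")

# Tricky inputs that often expose encoding, escaping, or truncation bugs.
samples = [
    "plain ascii",
    "quotes ' \" and ; semicolons",
    "unicode: café ☕",
    "  leading/trailing spaces  ",
]
for s in samples:
    print(round_trip_check(conn, s))
# → True for each sample
```

Against a real web app you would do the same thing through the UI or the database: submit a value in a form, then confirm it displays and stores exactly as entered.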

-1 There is no standard terminology across testing (and there are broad disagreements when it comes to it), so learning someone else's definitions is not a great place to start.
– Chris Kenst, Mar 28 '17 at 16:40

They don't have any if you read his post... Besides, these are general categories you can google, not company-specific terms. If you want to be a good test engineer, you need to understand industry terms, not company-specific ones. Company terminology goes out the door when you swap companies, and definitions shift depending on who is speaking, but not even knowing the industry terms well enough to talk about them wouldn't get you past the interview.
– Mutt, Mar 28 '17 at 16:41


+1 to Mutt. This is just the sort of answer I would give. This answer is NOT an attempt to provide a canonical list; it offers general categories that are familiar to all of us in the software-testing industry. I'd focus on boundary, data-integrity, and security testing, based on what you've said you've covered already.
– Michael Durrant, Mar 28 '17 at 19:46

To add to that, I think it is a good idea to start creating some sort of standard terminology across testing. If it's never created, there will never be any standard terminology.
– Renzeee, Mar 29 '17 at 14:56