2014-09-17


{{TopMenu/ru}} {{Menu/ru}} {{Menu.QA/ru}} {{Lang|Руководство пользователя MozTrap}} [[Category:QA/ru]]

= Introduction =

This guide describes, step by step, how QA contributors can perform some basic tasks in [[MozTrap]].

= Registration and Login =

[[File:login.png|right|thumbnail|Figure 1: Logging in to MozTrap]]

Before you can start testing, you need a user account. MozTrap supports both Mozilla Persona login and native login, and you are free to use either of them at any time. Your email address is the only identifier used to recognize your account.

== Login with OpenID ==

After selecting this option on the [http://manual-test.libreoffice.org/users/login/?next=/manage/cases/ login page], proceed as described below.

=== Login with Wordpress account ===

To use this option, you must already be logged in to WordPress.

Then simply enter your WordPress ID, such as ''http://MyName.wordpress.com'', into the OpenID pane and click ''OK''.

== Login with Mozilla Persona ==

[https://login.persona.org/ Mozilla Persona] is a generic login mechanism that is not restricted to MozTrap; it is also used by other web applications. Assuming you already know what Persona is and want to log in to MozTrap with it, simply click the ''Sign in'' button (''figure 1'') and follow the instructions in the pop-up window.

== Native Registration to MozTrap ==

To use the native login method, you first need to [http://manual-test.libreoffice.org/users/register register to MozTrap]. After registering, a confirmation mail will be sent to your email address; click the link inside it to activate the account.

== Default permission ==

With any of the login methods above, the LibreOffice community automatically assigns your account the [[QA/Testing/Test Cases Contribution|Tester]] role. You can log in to [http://manual-test.libreoffice.org MozTrap] and start testing without requesting any special permission.

= Run Tests =

This section gives detailed steps showing how to run a test in MozTrap.

== Select Test Run and Environment ==

[[File:Select test run 1.png|right|thumbnail|Figure 2: Select test run and configure environment]]

# In the Run Tests tab, select the appropriate:
#* Product - usually LibreOffice is the only choice
#* Version - usually the latest version, which needs testing most urgently
#* Run - choose the regression test or feature test you are interested in running
# In the "SET YOUR ENVIRONMENT" section, select the environment elements that reflect your actual testing environment:
#* Arch
#* Locale
#* Platform
# Press the "Run tests in ..." button to proceed.

{{clr}}

== Select Test Case ==

[[File:Select testcase.png|right|thumbnail|Figure 3: Selecting and executing test cases]]

As shown in ''figure 3'', MozTrap has an integrated UI for viewing and executing the test cases in a selected test run.

In this interface, all test cases are listed as rows. Clicking any row expands it to show the detailed description and execution controls of that test case, i.e. it becomes selected.

As mentioned above, MozTrap shows all test cases in a paginated list, so it is important to know that MozTrap lets testers filter the potentially large set of test cases in a particular test run. A common method is to filter by clicking the ''tags'' appended to the test case ''Name''. For example, clicking the ''p1'' tag shows all p1 test cases, and clicking the ''writer'' tag shows only the Writer test cases. More complex filtering can be done either by typing in the "RUN TESTS" filter box or by clicking the ''Advanced Filtering>>'' button at the top of the test case list.

In addition, you may notice that some test cases are already marked as ''PASSED'' or ''FAILED'' in their ''Results'' field, even though you never ran them. This means they have already been run by other testers in exactly the same environment as yours, so you probably do not want to repeat those executions. Test cases with ''Pending'' status in the ''Results'' field are therefore usually more worth running.

Finally, ''p1'' (high priority) test cases cover the most important and basic functions a LibreOffice version should provide. Testers are encouraged to finish the ''p1'' test cases in most testing phases and scenarios.

{{clr}}

== Execute Test and report results==

After selecting a test case by expanding its row, you will see instructions describing how the test should be executed. All you need to do here is follow the steps in the description and record the results you observe. To record the different kinds of results, click the corresponding button and, where required, write a short comment:

* '''PASS TEST''' button - click it when all of the steps described in the test case have passed

* '''FAIL STEP''' button - click it when a particular step described in the test case has failed, which makes the entire test case fail. You will have to explain how it failed in the subsequent text area. Optionally, you can also give the URL of a Bugzilla report documenting the issue.

* '''Invalid''' button - it appears right next to the ''PASS TEST'' button as an exclamation mark on a yellow background. Click it when a test case contains obscure steps, invalid content, or a serious misunderstanding. Test case managers will then need to update the test case according to your report. A brief explanation is mandatory when marking a test case ''Invalid''.

{{clr}}

= Create New Test Cases =

== Get authorization ==

In MozTrap, to create new test cases directly, you need at least the [[QA/Testing/Test Cases Contribution|Test Creator]] role. There are also other [[QA/Testing/Test Cases Contribution|ways]] to submit new test cases, but they are outside the scope of this guide.

== Add a new test case ==

[[File:Create testcase 1.png|right|thumbnail|Figure 4: Interface for test case management]]

[[File:Create testcase 2.png|right|thumbnail|Figure 5: Interface for creating a single test case]]

As ''figure 4'' indicates, navigating to ''Manage->Cases'' tab will show you the layout to handle test cases.

To add a single test case, click the ''+ create a test case'' button near the top right of the page. A test case creation page (''figure 5'') then lets you enter the required information. The basic ideas behind the fields to fill in are:

* Product - The product that owns this test case.

* Version - The product version of this test case.

* And Later Versions - Create a test case version for the specified Product Version as well as a case version for each later Product Version. (e.g.: if Product Versions 3, 4 and 5 exist for this Product, and you have specified Product Version 4, this case will be created for versions 4 and 5)

* Suite - The existing suite to which you want this case to belong. You can also add cases to suites later.

* Name - The summary name for the case.

* Description - Any description, pre-conditions, links or notes to associate with the case. This field is displayed while running the test. [http://daringfireball.net/projects/markdown/syntax ''Markdown syntax''] is supported.

* Add Tags - Enter tags to apply to this case. Hit enter after each tag to see the tag chicklet displayed. Auto-completes for existing tags.

* Add Attachment - You can attach files to cases that may help with running the test (e.g. images, audio, video, etc.).

* Instruction / Expected - The test instruction and the corresponding expected result. You can choose to put all instructions and expectations in one step, or break them down into individual steps. When running the test, you will have the option to fail specific steps, so you may find the latter a better approach. [http://daringfireball.net/projects/markdown/syntax ''Markdown syntax''] is supported.
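Since the ''Description'' and ''Instruction'' / ''Expected'' fields support Markdown, a step can carry simple formatting such as numbered sub-steps, emphasis, and code spans. The following is only a hypothetical sketch (the menu path and shortcut are made-up examples, not taken from a real test case):

<pre>
Open the character formatting dialog:

1. Select some text in the document
2. Choose **Format > Character...** from the menu bar, or press `Ctrl+D`
</pre>

See the [http://daringfireball.net/projects/markdown/syntax Markdown syntax] page for the full set of supported constructs.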

After filling in all the required fields, click the ''save test case'' button to save it. By default the test case is activated for direct use. Alternatively, you can mark the test case as ''disabled'' or ''draft'' if further work on it is needed.

== Add bulk test cases ==

[[File:Create testcase 3.png|right|thumbnail|Figure 6: Interface for creating bulk test cases]]


Creating test cases in bulk is conceptually no different from adding a single test case. When you have dozens of test cases, instead of clicking the ''+ create a test case'' button again and again to add them one by one, bulk creation lets you enter many test cases on a single page and submit them all at once.

Looking again at ''figure 4'', to add test cases in bulk, click the ''++..'' button on the left side of ''+ create a test case''. A bulk test case creation page then appears, as shown in ''figure 6''. The most distinctive aspect is that you need to follow some syntactic rules so that MozTrap can split your input into individual test cases and their steps. Currently the [http://moztrap.readthedocs.org/en/1.0.X/userguide/ui/import.html#bulk-test-case-entry-formats Gherkin-esque] format is the only supported syntax.
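As a sketch of what such bulk input looks like: each case starts with a ''Test that'' line giving its name, and ''When'' / ''Then'' pairs become its steps. The cases below are made up for illustration; check the linked format documentation for the exact keywords the parser accepts.

<pre>
Test that pasting unformatted text works
When I copy a formatted paragraph and press Ctrl+Shift+V
Then the paste special dialog appears
When I choose "Unformatted text"
Then the text is pasted without its original formatting

Test that undo reverts the paste
When I press Ctrl+Z after pasting
Then the pasted text is removed
</pre>

A blank line between cases keeps the boundary unambiguous when many cases are submitted at once.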

= Maintain Test Cases =

== Get authorization ==

In MozTrap, to edit, clone, or delete test cases directly, you need at least the [[QA/Testing/Test Cases Contribution|Test Manager]] role. There are also other [[QA/Testing/Test Cases Contribution|ways]] to submit test case update suggestions, but they are outside the scope of this guide.

== Edit, select environment, clone, and delete test cases ==

Once authorized as a test manager, you will see the test case management icons, located on the left side of each test case summary in the view shown in ''figure 4''. From left to right, they are ''edit'', ''select environment'', ''clone'' and ''delete''.

* '''Edit''' - allows you to update existing test cases.

* '''Environment''' - allows you to narrow down the list of environments for a given test case. With this function, you can uncheck any environments to which you do not want the test case to apply, and re-add environments that were previously removed; just check or uncheck items to include or exclude them. Environments are a rich topic, and the official [http://moztrap.readthedocs.org/en/1.0.X/userguide/model/environments.html MozTrap manual] is helpful for understanding them thoroughly.

* '''Clone''' - allows you to make an exact copy of an existing test case.

* '''Delete''' - allows you to remove a test case completely, in all the versions it belongs to.

= Review testing results =

[[File:Viewresults testruns.png|right|thumbnail|Figure 7: View results for all test runs]]

[[File:Viewresults testcases per run.png|right|thumbnail|Figure 8: View all results for a specific test run]]

[[File:Viewresults testcases all.png|right|thumbnail|Figure 9: View results for all test cases]]

A Test Result stores the results of a single execution of one test case from a test run, in a particular environment, by a particular tester.

Reviewing testing results is straightforward.

To get an overview of the results of all test runs, as seen in ''figure 7'':

* Navigate to ''View Results'' tab

* Click [http://manual-test.libreoffice.org/results/runs/ Test Runs] in the submenu

The Pass/Fail/Unclear icons and statistics are shown in the last column of the view. You can click the triangle icon at the beginning of each test run to expand more detailed information; there, clicking the ''See related test cases'' button brings you to all testing results of the test cases in that test run (''figure 8'').

Alternatively, to get an overview of the results of all test cases (''figure 9''):

* Navigate to ''View Results'' tab

* Click [http://manual-test.libreoffice.org/results/cases/ Test Cases] in the submenu

The Pass/Fail/Unclear icons and statistics are shown in the last column of the view (''figure 9''). You can click the triangle icon at the beginning of each test case to expand more detailed information; there, clicking the ''See related test results'' button brings you to all testing results of that test case. Also, once the case has been executed, you can click the Pass/Fail/Unclear icons to navigate to the same page.
