Best practices

Experiences, small talk, and other automation gossip.
iskrenpp
Posts: 11
Joined: Fri Aug 17, 2012 4:36 am

Best practices

Post by iskrenpp » Fri Nov 01, 2013 8:56 pm

All,
I have been using Ranorex for the past year and would like to share some of my best practices that resulted from that:
- Always use CSV files for data sources. Reason: they give you flexibility, because the data can be updated outside of your project by someone who has no access to the solution source code, and unlike Excel spreadsheets they need no external software to view and save them. Additionally, and most importantly, as your solution grows, saving and updating the test suite becomes slower and slower when spreadsheets are used (a bug that Ranorex will address soon), but that is irrelevant if spreadsheets are avoided. After all, the fewer dependencies the Ranorex solution has, the better, because there are fewer chances for it to break because of those dependencies.
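To illustrate how easy such outside updates are, here is a minimal sketch of a standalone snippet that edits one cell of the data file without Excel and without opening the Ranorex solution. The file name, row index, and column layout are made up for the example:

Code: Select all

using System.IO;

class UpdateTestData
{
    static void Main()
    {
        // Hypothetical data file used by the test suite; path, row and columns are examples only.
        const string csvPath = @"TestData.csv";

        // Read all rows, change one cell in the second data row, and write everything back.
        string[] lines = File.ReadAllLines(csvPath);
        string[] cells = lines[2].Split(',');
        cells[1] = "newPassword123";          // e.g. a Password column
        lines[2] = string.Join(",", cells);
        File.WriteAllLines(csvPath, lines);
    }
}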
Last edited by Support Team on Tue Nov 05, 2013 8:37 am, edited 1 time in total.
Reason: All capitals (=shouting) in subject

carsonw
Posts: 178
Joined: Tue Nov 08, 2011 10:01 pm

Re: BEST PRACTICES

Post by carsonw » Fri Nov 01, 2013 9:35 pm

If you are using CSVs, how do you deal with things like:

* Referenced data - I set a value in one column during one iteration of my test and reference that value in a later iteration of the same test.
* Readability - if I have a very large and complex test, how do you keep a CSV file easy to read?
* Ease of updating - if you have manual QAs, for example, who want to update test data, deciphering a large CSV can be a little more complex.
* Formulas - what if you have calculations in your test data? Yes, you can calculate the values in your code, but isn't it preferable for the test data to be self-contained (so technically a manual tester could pick up your test data and run the test without any additional information and/or consulting the code)?

Also - we've not experienced any slowdowns related to the size of the data, except for those caused by the bug introduced in 4.1.1 that was fixed in 4.1.2.

Just curious about your own experiences :)

iskrenpp
Posts: 11
Joined: Fri Aug 17, 2012 4:36 am

Re: Best practices

Post by iskrenpp » Fri Nov 22, 2013 5:43 am

Hi,
Honestly, I have not yet dealt with updating the CSV at run time and reading the updated column back. When I need to store something persistently only during run time, I use global parameters, since they hold values only while the run is active; when a new test run starts, a global parameter reverts to its original, manually set value. If I need something stored permanently and then reused, either in the same test run or in a completely separate test run, I simply save it in a text file that is not referenced as a data source by any test case but is a standalone storage file. This is especially useful when creating accounts: every account must be unique, so the external text file keeps the last used account number, is updated at run time, and is picked up again in the next test run, which avoids reusing account creation credentials.
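Here is a minimal sketch of that counter-file idea. The file name and location are my own example, not something Ranorex prescribes:

Code: Select all

using System;
using System.IO;

public static class AccountCounter
{
    // Standalone text file next to the test executable; not registered as a data source.
    private const string CounterFile = @"AccountCounter.txt";

    // Returns the next unique account number and persists it for future test runs.
    public static int NextAccountNumber()
    {
        int last = 0;
        if (File.Exists(CounterFile))
        {
            int.TryParse(File.ReadAllText(CounterFile).Trim(), out last);
        }

        int next = last + 1;
        File.WriteAllText(CounterFile, next.ToString());
        return next;
    }
}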
Currently I have a CSV with 11 columns, and keeping the first row as a column header helps a lot. To add or find data in a complex CSV, you then only need to work out which column number holds the data you are after. A text editor that shows line numbers, together with its 'Find' functionality, is also a great help.
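If counting columns by hand gets tedious, a small helper can map the header names to the values so rows can be read by column name. This is only a sketch; it assumes plain comma separation without quoted fields, and the file and column names in the usage comment are examples:

Code: Select all

using System.Collections.Generic;
using System.IO;
using System.Linq;

public static class CsvReader
{
    // Reads a header-plus-rows CSV into dictionaries keyed by column name.
    // Assumes simple comma separation with no quoted or escaped fields.
    public static List<Dictionary<string, string>> ReadRows(string path)
    {
        string[] lines = File.ReadAllLines(path);
        string[] headers = lines[0].Split(',');
        var rows = new List<Dictionary<string, string>>();

        foreach (string line in lines.Skip(1))
        {
            string[] values = line.Split(',');
            var row = new Dictionary<string, string>();
            for (int i = 0; i < headers.Length && i < values.Length; i++)
            {
                row[headers[i].Trim()] = values[i].Trim();
            }
            rows.Add(row);
        }
        return rows;
    }
}

// Usage: var rows = CsvReader.ReadRows("TestData.csv");
//        string userName = rows[0]["UserName"];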
Regarding test data with calculations: I have never seen or used anything like that, and I am not even sure how it could be done, unless you mean saving the formulas as strings in the CSV.
And lastly - yes, the slowness I was referring to was the bug you mention. Even with that bug fixed, using Excel spreadsheets still requires at least an Excel viewer on the execution machine, whereas with CSV you need nothing else. Besides, you can still open a CSV with Excel, where all the data is nicely displayed in columns, and reading or editing it that way is much easier.

iskrenpp
Posts: 11
Joined: Fri Aug 17, 2012 4:36 am

Re: Best practices

Post by iskrenpp » Fri Nov 22, 2013 5:54 am

The power of the PopupWatcher class.

I want to talk about the power of the PopupWatcher class. As most Ranorex users know, it is a very powerful way to create and control separate threads that constantly monitor in the background for the element in question. The original idea is to handle unexpected dialogs, BUT I have also started using watchers to handle interactive UIs that are expected rather than unexpected. I test games that often have interactive elements in addition to the main game play. To factor out the handling of this extra interaction, I simply have a set of popup watchers that watch for specific elements present only in the game in question and trigger some sort of interactive picking when they find them. This lets me concentrate on writing test cases for the main game play, knowing that any additional interactive component, whether triggered intentionally or unintentionally, is already taken care of by the game-specific popup watchers.
The only important step is to make sure the element path that triggers the popup watcher (the one it looks for) is very specific: as soon as the interactive part should stop, the triggering element must no longer be found, so the popup watcher is not triggered again.
Additionally, once a popup watcher is started from one module, it stays active for the remainder of the test run, regardless of which other test modules are executed. That removes the need to start it in every code module, something I was initially unsure about, which is why I used to start the watchers in every code module that used them.
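Here is a rough sketch of how such a game-specific watcher can be wired up. I am writing the Ranorex calls (Watch, WatchAndClick, Start) and the callback signature from memory, so check them against your Ranorex version; the repository class name and the repository items (GameRepository, repo.Game.ClaimButtonInfo, repo.Game.InteractivePickerInfo) are made-up examples:

Code: Select all

using Ranorex;
using Ranorex.Core;
using Ranorex.Core.Repository;

public static class GameWatchers
{
    private static PopupWatcher watcher;

    // Start the watchers once; they keep running for the rest of the test run,
    // no matter which test modules execute afterwards.
    public static void StartGameWatchers()
    {
        if (watcher != null)
            return; // already running, no need to start it again in every module

        var repo = GameRepository.Instance; // assumed generated repository class

        watcher = new PopupWatcher();

        // Simple case: whenever the claim button of a reward popup appears, click it.
        watcher.WatchAndClick(repo.Game.ClaimButtonInfo);

        // More involved case: run a custom handler when an interactive picker shows up.
        watcher.Watch(repo.Game.InteractivePickerInfo, HandleInteractivePicker);

        watcher.Start();
    }

    private static void HandleInteractivePicker(RepoItemInfo itemInfo, Element element)
    {
        // The trigger element should exist only while the interactive state is active,
        // so it disappears once the interaction is done and the watcher stops firing.
        Report.Info("Interactive picker detected - handling it outside the main test flow.");
        // ... the actual picking steps for the game would go here ...
    }
}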