You can use the following in your If Controller to check whether the file exists.
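For example (a sketch only: JMeter's __groovy function requires JMeter 3.1 or later, and the file path below is a placeholder you would replace with your own), the If Controller condition could be:

```
${__groovy(new File("/path/to/data.csv").exists())}
```

Remember to tick "Interpret Condition as Variable Expression?" on the If Controller so it evaluates the function result directly instead of treating the condition as JavaScript.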
Please help me with this problem.
I tried to close this pop-up page with Selenium, but I was not able to do it.
<div class="popupsignup-disp"> <div class="popupsignup-cntbox"> <div class="container-fluid"> <div class="row"> <div class="col-xs-10 col-xs-offset-1 col-md-10 col-md-offset-1 ntp-cnt-wrap"> <div class="row"> <div class="col-xs-12 text-center voffset4"><img alt="" src="/static/img/nico-tides-moon.png"></div> <div class="col-xs-12 voffset3 hidden-xs hidden-sm"></div> <div class="col-xs-12 voffset2"> <h3 class="npr">This full moon’s extra special, and we’ve got prices to match.</h3> </div> <div class="col-xs-12 voffset6"> <h5 class="text-uppercase head-text">Special Pricing goes live</h5> </div> <div class="col-xs-12 voffset3"> <div class="row"> <div class="col-xs-6 text-right potd-wrap"> <div class="pop-time"><span>Online</span> <span class="hlgt">14 + 15 December</span> <span class="small">All Day</span></div> </div> <div class="col-xs-6 text-left"> <div class="pop-time"><span>In-store</span> <span class="hlgt">14 - 18 December</span> <span class="small">11 am To 8 pm</span></div> </div> <div class="pop-divider"></div> </div> </div> <div class="col-xs-12 voffset5"><a class="text-uppercase pop-btn" href="/catalogue/category/special-price/women_126/" id="ntnc-linknclose">Shop now</a></div> </div> </div> </div> </div> </div> <a class="close-popupsignup-wrap"></a> </div>
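Before driving Selenium, it can help to confirm which elements in that markup are plausible close controls. A quick plain-Node sanity check (assumption: the real page matches the snippet above; only the two relevant anchors are reproduced here):

```javascript
// Trimmed copy of the popup markup from the question. The two candidate
// close controls are the bare anchor "a.close-popupsignup-wrap" and the
// "Shop now" link, whose id "ntnc-linknclose" suggests it also closes
// the popup.
const popupHtml =
  '<div class="popupsignup-disp">' +
  '<a class="text-uppercase pop-btn" id="ntnc-linknclose"' +
  ' href="/catalogue/category/special-price/women_126/">Shop now</a>' +
  '<a class="close-popupsignup-wrap"></a>' +
  '</div>';

// Check that both candidate locators actually occur in the markup.
const closers = ['close-popupsignup-wrap', 'ntnc-linknclose']
  .filter(name => popupHtml.includes(name));
console.log(closers);

// With selenium-webdriver, the click itself would then be (untested sketch):
//   await driver.findElement(By.css('a.close-popupsignup-wrap')).click();
// or, if the "Shop now" link is the intended closer:
//   await driver.findElement(By.id('ntnc-linknclose')).click();
```

If the click still fails, the popup may sit inside a frame or behind an overlay, which is worth ruling out before blaming the locator.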
If we separate quality into the three aspects of software quality (functional, process and structural), I think the Agile methodology has a clear impact on all three. I will try to explain by quoting some relevant Agile principles.
At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behaviour accordingly.
Most Agile frameworks like Scrum use retrospectives to continuously improve process quality.
Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
Getting continuous feedback should result in better functional quality, as users explain what works, what doesn't, and what behaves unexpectedly. Because you deliver working software frequently, you can measure functional quality better and adapt faster than with a slow release cycle such as one release a year.
Continuous attention to technical excellence and good design enhances agility.
Most Agile frameworks (except Scrum) advocate clear technical practices. Practices like TDD, pair programming, continuous delivery, clean code and others all lead to higher structural quality.
The eXtreme Programming (XP) Agile framework has some good starting rules when it comes to coding, which are advisable to improve not only your agility but also your overall product quality.
This just scratches the surface of what I think Agile brings to the table to improve the quality of a software product. Just keep in mind that Scrum lacks some key technical practices and that you should combine it with XP and/or the LeSS practices. Without these practices you will most likely only improve your process quality.
I was recently selected for an interview panel for Automation Engineers. As I have never conducted interviews before, I am confused about how to judge a candidate. What kind of questions should I ask, and what should I expect?
The question asked there is something different: that asker wants to know how to interview a candidate who isn't into testing but is a coder who creates frameworks (exact phrase from that question: "besides some basic programming questions, maybe a single very basic testing question, what else should I ask?"). My question is about a candidate who knows both testing and the fundamentals of automation. In my understanding, a test engineer should be aware of the manual testing process as well, even if the candidate is more into automation. What should I ask, and how can I judge the candidate based on those questions?
I have been a tester at an eCommerce startup in the travel business for the last 8 months (my first job as a QA), doing mostly manual testing, which I found quite boring. Over these 8 months I have been warned once that I need to improve or I will be let go, which I did.
Yesterday I was notified that my performance had gotten very bad again, which surprised me because I was told I was doing well up to a month ago, and I wasn't given any feedback to indicate that I was getting worse.
I have my doubts about the validity of that, since there was no warning, plus the company just went into its second round of fundraising and has been letting a lot of people go over the past month: 6 out of the 27 workers, to be exact.
I have been quite shaken by the situation, because I take my job very much to heart, and I am sorry to say that it has hurt my confidence in my testing skills.
So, now that I have explained the situation, on to my question.
Should I try to find another job in QA, or call it a day with this profession? Or am I overreacting and just taking this failure a bit too much to heart? I feel like I know the answer to this, but I wanted to hear from people with more experience than I have.
I am working as a manual and automation tester for a financial company.
Testers in my company don't have access to the source code of our applications, and we have limited access rights to our version control repositories (Git). This is due to confidentiality and security obligations: a general rule says that each employee in the company should have only the minimal access to resources (information, systems, etc.) essential to complete her work, while access to information not related to her work should be restricted.
Someone decided long ago that testers did not need access to the source code to complete their work. That was at a time when we were using a waterfall model and the QA team was separated from development. But now we are transitioning to Agile, and testers and devs are combined into one Scrum team.
I have discussed this topic with a few developers who have worked for other companies during their careers, and they told me that this is a "strange" rule and that testers do have access to the codebase elsewhere.
I am going to try to convince management to relax this rule and give testers at least read-only access to the source code, and I am looking for arguments for this change. Are there also any potential disadvantages or threats?
Some arguments in favour that I can imagine:
Some web pages display elements based on the viewport size of the browser. I have used the CheckMyLinks XPI with chromedriver in a test automation suite in the past, and it provided consistent results.
Please ensure you use the same viewport size and check the results for consistency, or follow the approach below to verify whether viewport size impacts the number of links on the page: open the browser's developer console, type $x("//*[contains(@href,'http')]") and hit return.
Observe the number of elements it returns, then use the same approach with a different viewport size.
Note: if the hrefs on the page use relative links, you may need to modify the XPath accordingly.
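As a rough plain-Node sketch of what that console XPath counts (a regex stands in for the XPath here, assuming double-quoted attributes; the sample page is made up, and a real parser would be preferable in production checks):

```javascript
// Count elements whose href attribute contains "http", mimicking
// $x("//*[contains(@href,'http')]") from the browser console.
const sampleHtml =
  '<a href="http://example.com/a">absolute link</a>' +
  '<a href="/relative/path">relative link</a>' +
  '<link href="https://example.com/style.css">';

// [^"]* keeps each match inside a single attribute value.
const matches = sampleHtml.match(/href="[^"]*http[^"]*"/g) || [];
console.log(matches.length); // 2: the relative href is not counted
```

Running the same count at two viewport sizes (e.g. 1920x1080 vs 375x667) and comparing the totals tells you whether responsive hiding is behind the inconsistent link counts.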
I have recorded actions with Ranorex and I want to provide a reusable method. I selected all the actions and chose the context-menu entry "Merge items to user code item", which generated C# code for me.
Unfortunately, the code is not clean, because it does not use a decorator pattern for the log statements: every second line is a Report.Log() call. Aside from this architectural flaw, is there a way I could generate user code without those log calls?
I have tried:
You could use the waitForElementPresent command before saving the calculated value.
When pointing to the input field, use an XPath that checks not only the id but also that the value is not empty:
//input[@id='Save' and not(@value='')]
(not 100% sure whether this XPath is correct for your page)
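Put together as Selenium IDE commands, this might look like the following sketch (the storedVal variable name is my own placeholder, and the field's id is taken from the XPath above):

```
waitForElementPresent | //input[@id='Save' and not(@value='')] |
storeValue            | //input[@id='Save' and not(@value='')] | storedVal
```

The wait ensures the field exists and has a non-empty value before storeValue reads it.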
We are using Cucumber.js with sync-request to do headless end-to-end testing of our web app. Each scenario has its own x-correlation-id, which is set as a header on each call; this makes it easy to track all the logs issued for each scenario in Kibana. We save three types of log from Cucumber: a pretty report, a JSON report, and all the console.log messages saved to a file, using this:
cucumber-js.cmd -f json:results\cucumber-results.json -f pretty:results\cucumber-results-pretty.txt > results\cucumber-console.log
The correlation id is saved to cucumber-console.log, but we log a lot of stuff to this file and it can take a while to arrive in the Jenkins build artifacts, so I would like to add the id to the pretty results along with the assertion failure. The pretty results are output to the Jenkins log when the test run has finished (which is then followed by the UI tests). I am trying to understand hooks, but the information out there is either easy to understand but for Ruby, or scarce for JS.