Testing Blog

How We Tested Google Instant Pages

Wednesday, July 27, 2011
Labels: Jason Arbon, Tejas Shah

14 comments:

  1. Chris Kenst, July 27, 2011 at 15:59:00 GMT-7

    How extensively does Google use Selenium for test automation and in what ways? Thanks

    Chris

  2. BlackTigerX, July 27, 2011 at 18:20:00 GMT-7

    Why couldn't you just preload the HTML and all associated files? Does the rendering take that long?

  3. Kazumatan, July 27, 2011 at 20:54:00 GMT-7

    It sounds like you are hinting at setting up regression tests of application-generated DOMs, with the differences screened by humans. The keys would be: (1) creating/choosing test scripts that generate plenty of coverage, maximizing true positives and minimizing false positives; (2) possibly masking out the parts of the DOM that are likely to change most of the time; (3) making it easy and fast for humans to review the possible regressions and, of course, report the true ones.
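
    A minimal sketch of that masked DOM-snapshot comparison, assuming Selenium's Python bindings (the masked selectors and the recorded fields are illustrative choices, not anything described in the post):

      # Sketch: snapshot a page's DOM geometry, skipping regions expected to
      # change (ads, tickers), and diff against a stored baseline snapshot.
      import json
      from selenium import webdriver

      MASKED_SELECTORS = [".ad-slot", "#ticker"]  # hypothetical "always changing" regions

      SNAPSHOT_JS = """
      var masked = arguments[0], skip = [], out = [];
      masked.forEach(function (sel) {
        document.querySelectorAll(sel).forEach(function (e) { skip.push(e); });
      });
      document.querySelectorAll('*').forEach(function (e) {
        if (skip.some(function (m) { return m.contains(e); })) return;
        var r = e.getBoundingClientRect();
        out.push([e.tagName, e.id, Math.round(r.left), Math.round(r.top),
                  Math.round(r.width), Math.round(r.height)]);
      });
      return JSON.stringify(out);
      """

      def snapshot(url):
          driver = webdriver.Chrome()
          try:
              driver.get(url)
              return json.loads(driver.execute_script(SNAPSHOT_JS, MASKED_SELECTORS))
          finally:
              driver.quit()

      def regressions(baseline, current):
          # Elements present only in one snapshot are candidate regressions
          # for a human reviewer to confirm or dismiss.
          base, cur = set(map(tuple, baseline)), set(map(tuple, current))
          return {"missing": base - cur, "unexpected": cur - base}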

  4. bmaher, July 28, 2011 at 2:58:00 GMT-7

    Are there any plans to open-source this utility? I would be interested to see how I could use something like this in my work.

  5. Ben, July 28, 2011 at 10:29:00 GMT-7

    Hello.
    That quality bot you mention sounds really cool. Do you know if this will ever be made available? It sounds like it would make a good companion to Selenium/WebDriver.

    Replies:
    1. Anonymous, May 23, 2016 at 10:55:00 GMT-7

      Python + Selenium (screenshots!) + ImageMagick.
      1) Get the page, save it to a db/archive.
      2) Compare against last/base.
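
      A rough sketch of that pipeline, assuming Selenium's Python bindings and ImageMagick's "compare" tool on the PATH (paths and the tolerance are made up for illustration):

        # Sketch: capture a screenshot, archive it, and diff it against the
        # previous capture with ImageMagick's `compare -metric AE`.
        import subprocess
        from pathlib import Path
        from selenium import webdriver

        ARCHIVE = Path("screenshots")

        def capture(url, name):
            ARCHIVE.mkdir(exist_ok=True)
            driver = webdriver.Chrome()
            try:
                driver.get(url)
                path = ARCHIVE / f"{name}.png"
                driver.save_screenshot(str(path))
                return path
            finally:
                driver.quit()

        def pixels_changed(base, new, diff_out="diff.png"):
            # `compare -metric AE` writes the count of differing pixels to stderr.
            result = subprocess.run(
                ["compare", "-metric", "AE", str(base), str(new), diff_out],
                capture_output=True, text=True)
            fields = result.stderr.strip().split()
            return float(fields[0]) if fields else 0.0

        base = capture("https://example.com", "base")
        latest = capture("https://example.com", "latest")
        if pixels_changed(base, latest) > 100:  # arbitrary tolerance
            print("Possible rendering change; inspect diff.png")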

  6. Anonymous, July 28, 2011 at 11:54:00 GMT-7

    Chris, Selenium (and Webdriver) are used very heavily at Google, and we have a centralized farm of Selenium machines to execute these tests around the clock.

    BlackTigerX, rendering does take time, and every millisecond is interesting :) Also, some of the larger, script-driven, AJAX-y sites need the full DOM loaded to complete rendering.

    Kazumatan, you are right. We are also working to make it easy for the human raters to label the non-interesting but frequently changing portions of the DOM (think Google Feedback style region selection) for later filtering.

    Dojann, open sourcing is definitely on the map; only the timing is a question. We've designed most of it so that it could run outside of Google's infrastructure for just this reason :) We are also looking at hosting options that let folks easily run on hosted machines they own, with VPN access to their staging environments. I can't speak to the timing, but it is partly dependent on the level of interest from the community in these options.

    Ben, yup, we are hoping to share the service and code 'soon'. The more interest we see, the faster this will happen.

    cheers!

  7. James Whittaker, July 28, 2011 at 13:39:00 GMT-7

    Expect major updates and perhaps even open sourcing at GTAC in October.

  8. Raghav, August 1, 2011 at 3:08:00 GMT-7

    I have two questions:

    1. Comparing page rendering with Instant Pages turned off and on seems cool. But there should already have been some tool/automation that verified the rendering of pages before the Instant Pages feature was introduced. How was that being done, and why wasn't it used here?

    2. How did the pixel/DOM comparison handle dynamically generated ads? Did you just verify the placeholders/DOM elements and not the content?

  9. Anonymous, August 2, 2011 at 14:13:00 GMT-7

    Great questions, Raghav. Chrome and many internal teams at Google use a variety of tools, including "quality bots", to automatically verify rendering and catch layout issues. The reason we had to use quality bots here is that they work at scale, automagically, whereas most traditional automation tools require a custom test for each page, which is hard to scale. Also, keep in mind that the page is hidden until it is made visible, and the only way to know the page was prerendered is to have injected JS while the page was in the prerendered state.

    Re dynamic ads: you are right. In general the bots verify information about the elements, but not the content. On top of that, we have an ad-detection mechanism in place to detect and ignore ads while comparing.
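
    As a rough illustration of that injected-JS idea (not the team's actual harness), a probe embedded in the page under test could record its visibility state while still hidden; the 'prerender' state and the webkit-prefixed fallback are assumptions about the browser, not documented guarantees:

      # Sketch: JS probe served with the page under test; it records whether
      # the document loaded while hidden in a prerender, so a harness can
      # verify later that prerendering actually happened.
      PRERENDER_PROBE_JS = """
      (function () {
        var state = document.visibilityState || document.webkitVisibilityState;
        window.__loadedWhilePrerendered = (state === 'prerender');
      })();
      """

      # After the page has actually been navigated to (and is visible), a
      # Selenium-driven check could read the flag back:
      #   was_prerendered = driver.execute_script(
      #       "return window.__loadedWhilePrerendered === true;")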

  10. Anairda, August 5, 2011 at 7:11:00 GMT-7

    Will there be any tutorials on building these kinds of bots?

  11. Thiago F Peçanha, August 12, 2011 at 9:12:00 GMT-7

    Could you use Sikuli to compare the images?

    Regards,
    Thiago Peçanha

  12. Tim, August 16, 2011 at 5:53:00 GMT-7

    We use pixel comparison (against a defined baseline) at our company in our automated regression testing. We have found that dynamically generated ads have caused a lot of problems. To start with we just set the success threshold lower, but this was not satisfactory. So now we only do the pixel comparison for pages with web ads.

    I'm thinking about solving this problem by asserting that partial images can be found within the page being tested.
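
    One way to sketch that partial-image assertion is template matching, e.g. with OpenCV; the file names and the match threshold here are illustrative, not Tim's actual setup:

      # Sketch: assert that a known-good fragment (say, a navigation bar crop)
      # appears somewhere in the full-page screenshot, so noisy ad regions
      # can be left out of the comparison entirely.
      import cv2

      def fragment_present(page_png, fragment_png, threshold=0.95):
          page = cv2.imread(page_png, cv2.IMREAD_GRAYSCALE)
          frag = cv2.imread(fragment_png, cv2.IMREAD_GRAYSCALE)
          scores = cv2.matchTemplate(page, frag, cv2.TM_CCOEFF_NORMED)
          _, best, _, location = cv2.minMaxLoc(scores)
          return best >= threshold, location

      ok, where = fragment_present("page.png", "nav_bar.png")
      assert ok, "Expected fragment not found in the rendered page"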

  13. Anonymous, August 19, 2011 at 14:21:00 GMT-7

    Hi Anairda. Likely no tutorials soon; we hope to do one better and just open-source it, document it, and let folks re-host their own instances if they like. I'm happy to chat about the details. The crawler is relatively easy (think lots of elementFromPoint() calls); the hard problems are scale, reporting, and rendering the data.
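
    A toy version of that elementFromPoint() crawl, via Selenium's Python bindings (the grid step and the recorded fields are made-up choices, not the bots' actual schema):

      # Sketch: sample the visible viewport on a coarse grid and record which
      # element owns each point -- the raw data a DOM-diffing bot could
      # compare across runs, browsers, or feature flags.
      import json
      from selenium import webdriver

      GRID_JS = """
      var step = arguments[0], out = [];
      for (var y = 0; y < window.innerHeight; y += step) {
        for (var x = 0; x < window.innerWidth; x += step) {
          var e = document.elementFromPoint(x, y);
          out.push([x, y, e ? e.tagName : null, e ? e.id : null]);
        }
      }
      return JSON.stringify(out);
      """

      def grid_snapshot(url, step=20):
          driver = webdriver.Chrome()
          try:
              driver.get(url)
              return json.loads(driver.execute_script(GRID_JS, step))
          finally:
              driver.quit()

      # Diffing two snapshots of the same URL point-by-point then surfaces
      # elements that moved, disappeared, or changed ownership of a region.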

    Hi Thiago. Sikuli is very cool. Some folks have used it at Google, some have built similar approaches, and there are even commercial products that work this way. We have fundamentally focused on the DOM diff rather than the pixel diff, for three reasons. 1. When you detect and file a 'bug' based on a screenshot, it is a significant amount of work to repro and debug the underlying issue that caused the pixels to be off, so why not just grab that data while you are on the site? 2. If you know the structure of the web page, you don't need fancy, probabilistic approaches to identify elements that have scaled, translated, or failed to appear; you know exactly which DOM elements have failed. 3. We are building a corpus of which elements tend to cause differences, so we can hopefully correlate failures across many sites/runs to determine whether there are underlying issues in the browsers, tooling, or DOM usage; that's the ultimate goal. Great question; this is fundamental to the what and why of Bots.

    Hi Cithan. False positives and noise from ads were the reason a lot of people avoided this area, thinking it couldn't yield useful data :) We use an 'ignore' filter for data from common ad-like sites during our crawl. We are also working on a way for our first-line crowd-sourced evaluators to mark page areas as don't-care, on a per-site basis, to add to the filter set. Most significantly, though, we also have the notion of a 'baseline' for a site. If the site permutes all the time, but within a range, you can choose to only flag it when it goes outside of that normal range. Data from many top portal sites looks like this: the URLs and divs shift around a bit day to day, but the amount of entropy day over day stays within a normal range/band.
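
    One way to read that normal range/band, as a sketch: keep a history of day-over-day difference scores and only flag a crawl that falls outside the usual band (the scoring and the band definition are illustrative assumptions, not the bots' actual model):

      # Sketch: flag today's crawl only when its diff score leaves the site's
      # normal day-over-day band (here, mean +/- 3 population standard deviations).
      import statistics

      def outside_normal_band(history, today, k=3.0):
          # history: past daily diff scores, e.g. fraction of sampled points changed
          if len(history) < 5:          # not enough data to define a band yet
              return False
          mean = statistics.mean(history)
          spread = statistics.pstdev(history) or 1e-9
          return abs(today - mean) > k * spread

      daily_scores = [0.12, 0.10, 0.14, 0.11, 0.13, 0.12]
      print(outside_normal_band(daily_scores, today=0.45))  # True: unusually large change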



