Customer Use Case Archives - Automated Visual Testing | Applitools
https://applitools.com/blog/tag/customer-use-case/
Applitools delivers the next generation of test automation powered by AI-assisted computer vision technology known as Visual AI.

Test Automation Video Summer Roundup: May-August 2022
https://applitools.com/blog/test-automation-video-summer-2022-roundup/
Fri, 05 Aug 2022
Get all the latest test automation videos you need right here. All feature test automation experts sharing their knowledge and their stories.


Get all the latest test automation videos you need in one place.

It’s summertime (at least where I am in the US), and this year has been a hot one. Summer is a great season to take a step back, to reflect, and hopefully to relax. The testing world moves so quickly sometimes, and while we’re all doing our jobs it can be hard to find the time to just pause, take a deep breath, and look around you at everything that’s new and growing.

Here at Applitools, we want to help you out with that. While you’ve hopefully been enjoying the nice weather, you may not have had a chance to see every video or event that you might have wanted to, or you may have missed some new developments you’d be interested in. So we’ve rounded up a few of our best test automation videos of the summer so far in one place.

All speakers are brilliant testing experts and we’re excited to share their talks with you – you’ll definitely want to check them all out below.

ICYMI: A few months back we also rounded up our top videos from the first half of 2022.

The State of UI/UX Testing: 2022 Results

Earlier this year, Applitools set out to conduct an industrywide survey on the state of testing in the UI/UX space. We surveyed over 800 testers, developers, designers, and digital thought leaders on the state of testing user interfaces and experiences in modern frontend development. Recently, our own Dan Giordano held a webinar to go over the results in detail. Take a look below – and don’t forget to download your free copy of the report.

Front-End Test Fest 2022

Front-End Test Fest 2022 was an amazing event, featuring leading speakers and testing experts sharing their knowledge on a wide range of topics. If you missed it, a great way to get started is with the thought-provoking opening keynote for the event given by Andrew Knight, AKA the Automation Panda. In this talk, titled The State of the Union for Front End Testing, Andrew explores seven major trends in front end testing to help unlock the best approaches, tools and frameworks you can use.

For more on Front-End Test Fest 2022 and to see all the talks, you can read this dedicated recap post or just head straight to our video library for the event.

Cypress Versus Playwright: Let the Code Speak

There are a lot of opinions out there on the best framework for test automation – why not let the code decide? In the latest installment in our popular versus series, Andrew Knight backs Playwright and goes head to head with Cypress expert Filip Hric. Round for round, Filip and Andy implemented small coding challenges in JavaScript, and the live audience voted on the best solution. Who won the battle? You’ll have to watch to find out.

Just kidding, actually – at Applitools we want to make gaining testing knowledge easy, so why would we limit you to just one way of finding the answer? Filip Hric summarizes the code battle (including the final score) in a great recap blog post right here.

Can’t get enough of Cypress vs Playwright? Us either. That’s why we’re hosting a rematch to give these two heavyweights another chance to go head to head. Register today to be a part of the Cypress vs Playwright Rematch Event on September 8th!

Coded vs. Codeless Testing Tools—And the Space In Between

There are a lot of testing debates out there, and coded vs codeless testing tools is one of the big ones. How can you know which is better, and when to use one or the other? Watch this panel discussion to see leading automation experts discuss the current landscape of coded and codeless tools. Learn what’s trending, common pitfalls with each approach, how a hybrid approach could work, and more.

Your panel for this event includes our own Anand Bagmar and Andrew Knight, along with Mush Honda, Chief Quality Architect and Coty Resenblath, CTO, both from Katalon.

Autonomous Testing, Test Cloud Infrastructure, and Emerging Trends in Software Testing

Looking to get a handle on where testing is heading in the future? Hear from our Co-Founder and CEO, Gil Sever, as he sits down for a Q&A with QA Financial to discuss the future of testing. Learn about the ways autonomous testing is transforming the market, advancements in the cloud and AI, and the ups and downs of where testing could go in the next few years. Gil also shares insights he’s learned from our latest State of UI/UX Testing survey.

Test Automation Stories from Our Customers

We know that every day you and countless others are innovating in the test automation space, encountering challenges and discovering – or inventing – impressive solutions. Our hope is that hearing how others have solved a similar problem will help you understand that you’re not alone in facing these obstacles, and that their stories will give you a better understanding of your own challenges and spark new ways of thinking.

Automating Manufacturing Quality Control with Visual AI

We all know about web and mobile regression testing, but did you know that Visual AI is solving problems in the manufacturing space as well? Jerome Rieul, Test Automation Architect, explains how a major Swiss luxury brand uses Visual AI to detect changes in CAD drawings and surface issues before they hit production lines. A great example of an out-of-the-box application of technology leading to fantastic results.

Simplifying Test Automation with Codeless Tools and Visual AI

Test automation can be hard, and many shops struggle to do it effectively. One way to lower the learning curve is to take advantage of a codeless test automation tool – and that doesn’t mean you have to forego advanced and time-saving capabilities like Visual AI. In this webinar Applitools’ Nikhil Nigam shares how Visual AI can integrate seamlessly with codeless tools like Selenium IDE, Katalon Studio, and Tosca to supercharge verifications and meet industrial-grade needs. (And for more on codeless testing tools, don’t forget to watch our lively panel discussion!)

How EVERFI Moved from No Automation to Continuous Test Generation in 9 Months

Starting up test automation from scratch can be a daunting challenge – but it’s one that countless testing teams across the world have faced before you. In this informative talk, Greg Sypolt, VP of Quality Engineering, and Sneha Viswalingam, Director of Quality Engineering, both from EVERFI, share their journey. Learn about the tools they used, how they approached the project, and the time and productivity savings they achieved.

More to Come!

This is just a selection of our favorite test automation videos that we’ve shared with the community this summer. We’re continuously sharing more too – keep an eye on our upcoming events page to see what we have in store next.

What were your favorite videos? Check out our full video library here, and you can let us know your own favorites @Applitools.

Our Best Test Automation Videos of 2022 (So Far)
https://applitools.com/blog/best-test-automation-videos-2022/
Fri, 20 May 2022
Check out some of our most popular events of the year. All feature test automation experts sharing their knowledge and their stories.


We’re approaching the end of May, which means we’re just a handful of weeks from the midpoint of 2022 already. If you’re like me, you’re wondering where the year has gone. Maybe it has to do with life in the northeastern US where I live, where we’ve really just had our first week of warm weather. Didn’t winter just end?

As always, the year is flying by, and it can be hard to keep up with all the great videos or events you might have wanted to watch or attend. To help you out, we’ve rounded up some of our most popular test automation videos of 2022 so far. These are all top-notch workshops or webinars with test automation experts sharing their knowledge and their stories – you’ll definitely want to check them out.

Cross Browser Test Automation

Cross-browser testing is a well-known challenge to test automation practitioners. Luckily, Andy Knight, AKA the Automation Panda, is here to walk you through a modern approach to getting it done. Whether you use Cypress, Playwright, or are testing Storybook components, we have something for you.

Modern Cross Browser Testing with Cypress

For more, see this blog post: How to Run Cross Browser Tests with Cypress on All Browsers (plus bonus post specifically covering the live Q&A from this workshop).

Modern Cross Browser Testing in JavaScript Using Playwright

For more, see this blog post: Running Lightning-Fast Cross-Browser Playwright Tests Against any Browser.

Modern Cross Browser Testing for Storybook Components

For more, see this blog post: Testing Storybook Components in Any Browser – Without Writing Any New Tests!

Test Automation with GitHub or Chrome DevTools

GitHub and Chrome DevTools are both incredibly popular with the developer and testing communities – odds are if you’re reading this you use one or both on a regular basis. We recently spoke with developer advocates Rizel Scarlett of GitHub and Jecelyn Yeen of Google as they explained how you can leverage these extremely popular tools to become a better tester and improve your own testing experience. Click through for more info about each video and get watching.

Make Testing Easy with GitHub

For more, see this blog post: Using GitHub Copilot to Automate Tests.

Automating Tests with Chrome DevTools Recorder

For more, see this blog post: Creating Your First Test With Google Chrome DevTools Recorder.

Test Automation Stories from Our Customers

When it comes to implementing and innovating around test automation, you’re never alone, even though it doesn’t always feel that way. Countless others are struggling with the same challenges that you are and coming up with solutions. Sometimes all it takes is hearing how someone else solved a similar problem to spark an idea or gain a better understanding of how to solve your own.

Accelerating Visual Testing

Nina Westenbrink, Software Engineer at a leading European telecom, talks about how the company made visual testing of its design system faster and simpler, offering helpful tips and tricks along the way. Nina also speaks about her career as a woman in testing and how to empower women and overcome biases in software engineering.

Continuously Testing UX for Enterprise Platforms

Govind Ramachandran, Head of Testing and Quality Assurance for Asia Technology Services at Manulife Asia, discusses challenges around UI/UX testing for enterprise-wide digital programs. Check out his blueprint for continuous testing of the customer experience using Figma and Applitools.

This is just a taste of our favorite videos that we’ve shared with the community from 2022. What were yours? You can check out our full video library here, and let us know your own favorites @Applitools.

Thunderhead Speeds Quality Delivery with Applitools
https://applitools.com/blog/thunderhead-speeds-quality-delivery-with-applitools/
Tue, 16 Feb 2021


Thunderhead is the recognised global leader in the Customer Journey Orchestration and Analytics market. The ONE Engagement Hub helps global brands build customer engagement in the era of digital transformation.  

Thunderhead provides its users with great insights into customer behavior. To continue to improve user experience with their highly-visual web application, Thunderhead develops continuously. How does Thunderhead keep this visual user experience working well? A key component is Applitools.

Before – Using Traditional Output Locators

Prior to using Applitools, Thunderhead drove its UI-driven tests with Selenium for browser automation and Python as the primary test language. They used traditional web element locators both for setting test conditions and for measuring the page responses.

Element locators have long been the state of the art for measuring page responses because of their precision. Locators can be generated programmatically, and test developers can target any visual structure on the page as an element.

Depending on page complexity, a given page can have dozens, or even hundreds, of locators. Because test developers can inspect individual locators, they can choose which elements they want to check. But locators also limit inspection: if a change takes place outside the selected locators, the test cannot find the change.

These output locators must be maintained as the application changes. Unmaintained locators cause test problems: a locator may report an error simply because its value changed while the test was not updated, or a locator may remain the same while the underlying behavior changes in ways the test never catches.
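To make the trade-off concrete, here is a minimal sketch of a locator-driven check. Thunderhead wrote its tests in Python with Selenium; this sketch uses Selenium's JavaScript bindings in TypeScript, and the URL, selectors, and expected text are hypothetical stand-ins. The key point is that only the elements the test explicitly selects ever get verified.

    import assert from 'assert';
    import { Builder, By } from 'selenium-webdriver';

    async function checkJourneysPage(): Promise<void> {
      const driver = await new Builder().forBrowser('chrome').build();
      try {
        // Hypothetical page and selectors - stand-ins for the real application.
        await driver.get('https://example.com/journeys');

        // Each assertion covers exactly one locator; anything not selected
        // here (layout, styling, neighboring widgets) goes unchecked.
        const header = await driver.findElement(By.css('h1.journey-title'));
        assert.strictEqual(await header.getText(), 'Customer Journeys');

        const rows = await driver.findElements(By.css('table.journeys tbody tr'));
        assert.ok(rows.length > 0, 'expected at least one journey row');
      } finally {
        await driver.quit();
      }
    }

    checkJourneysPage().catch((err) => { console.error(err); process.exit(1); });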

Thunderhead engineers knew about pixel diff tools for visual validation. They also had experience with those tools; they had concluded that pixel diff tools would be unusable for test automation because of the frequency of false positives.

Introducing Applitools at Thunderhead

When Thunderhead started looking to improve their test throughput, they came across Applitools. Thunderhead had not considered a visual validation tool, but Applitools made some interesting claims. The engineers thought that AI might be marketing buzz, but they were intrigued by a tool that could abstract pixels into visible elements.

As they began using Applitools, Thunderhead engineers realized that Applitools gave them the ability to inspect an entire page.  Not only that, Applitools would capture visual differences without yielding bogus errors. Soon they realized that Applitools offered more coverage than their existing web locator tests, with less overall maintenance because of reduced code.
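As a rough illustration of the difference (again in TypeScript rather than Thunderhead's Python, with a hypothetical URL and test name), a single full-page checkpoint with the Applitools Eyes SDK replaces many individual locator assertions:

    import { Builder } from 'selenium-webdriver';
    import { Eyes, Target } from '@applitools/eyes-selenium';

    async function checkJourneysPageVisually(): Promise<void> {
      const driver = await new Builder().forBrowser('chrome').build();
      const eyes = new Eyes(); // reads APPLITOOLS_API_KEY from the environment
      try {
        await eyes.open(driver, 'ONE Engagement Hub', 'Journeys page renders correctly');
        await driver.get('https://example.com/journeys');

        // One checkpoint captures the whole page; Visual AI compares it to the
        // baseline and flags meaningful differences anywhere on the screen.
        await eyes.check('Journeys page', Target.window().fully());

        await eyes.close();
      } finally {
        await eyes.abortIfNotClosed();
        await driver.quit();
      }
    }

    checkJourneysPageVisually().catch((err) => { console.error(err); process.exit(1); });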

The net benefits included:

  • Coverage – Thunderhead could write tests for each visible on-page element on every page
  • Maintainability – By measuring the responses visually, Thunderhead did not have to maintain all the web element locator code for the responses – reducing the effort needed to maintain tests
  • Visual Validation – Applitools helped Thunderhead engineers see the visual differences between builds under test, highlighting problems and aiding problem-solving.
  • Faster operation – Visual validation ran more quickly than analysis with traditional web element locators.

Moving Visual Testing Into Development

After using Applitools in end-to-end testing, Thunderhead realized that Applitools could help in several areas.

First, Applitools could help with development. When developers made changes to the user interface, unintended consequences could now be caught as early as check-in time. Previously, developers had to wait for end-to-end tests to expose these issues, stopping existing work and shifting context to repair older code. By moving visual validation to check-in, Thunderhead could make developers more effective.

Second, developers often waited to run their full suite of element locator tests until the final build. These tests ran against multiple platforms, browsers, and viewports. The net test run would take several hours. The equivalent test run using Applitools took five minutes. So, Thunderhead could run these tests with every build.

For Thunderhead, the net result was greater coverage, with tests run at the right time for developer productivity.

Adding Visual Testing to Component Tests

Most recently, Thunderhead has seen the value of using a component library in their application development. By standardizing on the library, Thunderhead looks to improve development productivity over time. Components ensure that applications provide consistency across different development teams and use cases.

To ensure component behavior, Thunderhead uses Applitools to validate the individual components in the library. Thunderhead also tests the components in mocks that demonstrate the components in typical deployment use cases.

By adding visual validation to components, Thunderhead expects to see visual consistency validated much earlier in the application development cycle.

Other Benefits From Applitools

Beyond the benefits listed above, Thunderhead has seen the virtual elimination of visual defects found through end-to-end testing. The check-in and build tests have exposed the vast majority of visual behavior issues during the development cycle. They have also made developers more productive by eliminating the context switches previously needed if bugs were discovered during end-to-end testing. As a result, Thunderhead has gained greater predictability in the development process.

In turn, Thunderhead engineers have gained greater agility. They can try new code and behaviors and know they will visually catch all unexpected behaviors. As a result, they are learning previously-unexplored dependencies in their code base. As they expose these dependencies, Thunderhead engineers gain greater control of their application delivery process.

With predictability and control comes confidence. Using Applitools has given Thunderhead increased confidence in the effectiveness of their design processes and product delivery. With Applitools, Thunderhead knows how customers will experience the ONE platform and how that experience changes over time.

Featured photo by Andreas Steger on Unsplash

Design Systems and Testability
https://applitools.com/blog/testing-design-systems-2/
Tue, 16 Jun 2020


What is a design system? Who would use it, and for what benefit?

In May 2020, Applitools had the pleasure of hosting Tyler Krupicka from Intuit for an hour-long webinar discussing design systems and testability.

Tyler works at Intuit, a 9,400-employee company headquartered in Mountain View, California, that specializes in accounting and tax preparation software. At Intuit, Tyler works on the “Player/Design Systems” team, where he focuses on design systems.

Tyler has been working on web development at Intuit since 2015. These days, he works on UI components, tools, and testing. He primarily uses TypeScript, React and Vue. He also focuses on accessibility, and he serves as a “Level 2” Accessibility Champion inside Intuit. As part of Tyler’s job, he ensures that design systems are testable.

What Is A Design System?

Tyler started by asking this question. Some people think about large design systems such as:

  • Material Design (Google’s design system)
  • Human Interface Guidelines (Apple’s design system)
  • Bootstrap (set of UI interface components and CSS that are popular on the web)

People look at these and think, “Is this overkill?” After all, Google uses Material Design for Android. How many people design an entire operating system? So, Tyler dove in to answer his question.

First, Tyler sees design systems as really “Design + Code.” One part of a design system is the “design” – the common components and patterns that get used throughout products. The other part, the “code,” involves chunks of functionality that interact with the design and that developers can reuse across a product – for example, CSS frameworks, React components, or any other mechanism that lets teams reuse the same code.

Design

“Design” turns into a consistent voice for your product. Those are things like typography, colors, and spacing, as well as UI components (forms, charts, site navigation), Graphics and Animation standards. These help standardize the visualization – the “look” – of the application. One can imagine an application developed by a large team that lacked a standard design – it would be disorienting to customers who expect consistency from software. A design system lets  you take base pieces, document them well, and reuse them.

Code

“Code” lets you build and document components individually, test them once, and have a way to set expectations on whether or not this application will work.  For instance, you can build a component in the app for one persona and reuse that component multiple times inside the app without worrying that you must retest it. Reuse makes code standard and creates standards for interacting with design. If you ever change design elements, the standard code reduces the effort to validate the new design with existing code. 
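As a simple illustration of the “code” half, here is a hypothetical design-system button in TypeScript and React (not an actual component from Intuit's library): one implementation, documented and tested once, then reused on every screen that needs a button.

    import React from 'react';

    export interface ButtonProps {
      /** Visual variant defined by the design system. */
      variant?: 'primary' | 'secondary';
      /** Visible text, which also serves as the accessible name. */
      label: string;
      onClick?: () => void;
      disabled?: boolean;
    }

    // Hypothetical reusable component; the "ds-" class prefix is an assumed
    // convention, not the naming of any real design system.
    export function Button({ variant = 'primary', label, onClick, disabled = false }: ButtonProps) {
      return (
        <button
          type="button"
          className={`ds-button ds-button--${variant}`}
          onClick={onClick}
          disabled={disabled}
        >
          {label}
        </button>
      );
    }

The testing examples later in this post reuse this same hypothetical Button.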

Combining design and code into a design system results in increased developer productivity and a consistent voice for the application. And using a design system simplifies the management of an application.

Design Systems at Intuit: TurboTax Example

Next, Tyler dove into the use of design systems for the Intuit TurboTax product. 

Tyler explained how TurboTax works. By providing users with interview-style questions, TurboTax helps millions of people, primarily in the United States and Canada, file their income tax returns. Intuit offers both web and mobile versions of TurboTax.

Because TurboTax uses interview questions to create conditional flows through the product, there are thousands of screens users can encounter based on their unique needs. Screens can range in complexity from a single question to a complete form. 

Building the TurboTax Design System

To create the design, Intuit develops about 50 UI components needed to run the experience. Headers, radio buttons, tiles, multi-select, data entry, action buttons – all become part of the basic UI. 

Intuit also builds a set of about 500 “mocks”, or mock-ups, of what common screens look like for testing. These help show the UI components in common layouts for testing purposes.

By using a design system, Intuit can make changes to the application and ensure consistency for all these screens. 

Tyler gave the example of a decision to deliver a uniform increase in contrast in TurboTax for accessibility. The contrast changes required updates to typography weight, button color, and link color. This work would impact over 90% of all the screens. How could Intuit deliver these changes to the application and ensure they occurred consistently across the application? The team did not have the staff needed to manually check every possible screen. Instead, they made the design changes to the UI components and validated the mocks.

Testing A Design System

Next, Tyler dove into a discussion of testing a design system. 

Approach

Since the design system represents all the functionality and visual representation of the application, Tyler recommends going broad with testing across all the elements, including:

  • Functionality
  • Visual
  • Performance
  • Accessibility

From a functional perspective, unit tests serve as key building blocks for code. Integration tests ensure that the code works with the back end. Cross-browser tests help ensure that the code works on a bunch of different browsers and mobile devices. 

Visual testing validates the impact of UI component changes on the overall UI. If buttons change size or Intuit uses a new font size, does the page continue to render correctly? Do unexpected breaks happen on pages? Do space reductions make pages look too constrained?

Performance testing involves checking the impact of code changes on page loading times.  Specifically, do new packages or package upgrades change the amount of data transferred between the server and the browser – slowing the user experience? 

Finally, Tyler tests accessibility to make sure that any customer who wants to use the product can do so. Whether a user has a screen reader or navigates solely by keyboard, ensure that the user can get around.

Getting Organized

With testing goals articulated, Tyler introduced Storybook as the tool for creating documentation and initial testing. If you use projects like React, Vue and Angular that allow you to build user interfaces in small components, the Storybook project lets you build interactions for those components and document their behavior.  

Effectively, you demonstrate a  use case for your components and show how they all interact together on a page.  Each use case is one story – hence Storybook is a collection of these stories. 

Storybook allows for a range of tests, including accessibility tests. For instance, Tyler uses Storybook to test keyboard navigation and focus. He also tests for things like hover behaviors – where components appear while the cursor hovers over another element, which may be difficult to test with automation. Testing with Storybook lets the testing team run tests early across the range of design system dimensions.
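Here is a minimal sketch of what a story for the hypothetical Button above might look like, written in Storybook's Component Story Format for React (Intuit's actual stories, naming, and Storybook version will differ):

    import type { Meta, StoryObj } from '@storybook/react';
    import { Button } from './Button';

    // Each named export is one "story": a documented, renderable use case that
    // functional, visual, and accessibility tests can all run against.
    const meta: Meta<typeof Button> = {
      title: 'DesignSystem/Button',
      component: Button,
    };
    export default meta;

    type Story = StoryObj<typeof Button>;

    export const Primary: Story = {
      args: { variant: 'primary', label: 'Continue' },
    };

    export const SecondaryDisabled: Story = {
      args: { variant: 'secondary', label: 'Back', disabled: true },
    };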

Dog Fooding

To deliver successful products, Tyler’s team must support the construction and use of these UI components. So, they actually write code to try out the components – just to see how they work. Putting together checkboxes, radio buttons, or drop-down lists helps show how the code interacts with components and creates mocks for testing the design system.

Tyler showed the example of a page that looked like static boxes. Each box expanded when clicked to show hidden detail and slid the rest of the page down. His team dog foods this kind of behavior.

All of these activities help create a body of testable code, so that a design change in the design system can be mocked up and validated without impacting production code.

Testing Walkthrough

Tyler then walked us through a simple change that can have significant implications – resizing a button. 

Without a design system, you might have to make a button change like this ad-hoc across your application. As a result, you risk changing the page-to-page behavior and look-and-feel. Using a design system lets you make the change globally, and then focus on the implications.

What kinds of implications? A larger button might displace elements on a page. The button may not function as expected with existing elements on a given page. All these changes need to be tested.

Unit Tests

First, Tyler looked at unit tests. Does the button click? Can the button work with an icon associated with it? Will the button work with a transparent background? Does the button work with assigned ARIA (Accessible Rich Internet Applications) attributes? Testing puts these Storybook components through their paces.  

Intuit has a pretty standard way of running through these tests. Tyler talked about using Jest, WebdriverIO, and Selenium to automate this testing across multiple browsers. They have lots of users on multiple web browsers, as well as Mac users on Safari and iOS, so they pay special attention to those target platforms. 
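A minimal sketch of what one of those unit tests could look like, using Jest with React Testing Library against the hypothetical Button from earlier (Intuit's actual harness, which also drives WebdriverIO and Selenium for cross-browser runs, will differ):

    import React from 'react';
    import { render, screen, fireEvent } from '@testing-library/react';
    import { Button } from './Button';

    describe('Button', () => {
      it('fires its click handler', () => {
        const onClick = jest.fn();
        render(<Button label="Continue" onClick={onClick} />);
        fireEvent.click(screen.getByRole('button', { name: 'Continue' }));
        expect(onClick).toHaveBeenCalledTimes(1);
      });

      it('does not fire when disabled', () => {
        const onClick = jest.fn();
        render(<Button label="Continue" onClick={onClick} disabled />);
        fireEvent.click(screen.getByRole('button', { name: 'Continue' }));
        expect(onClick).not.toHaveBeenCalled();
      });
    });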

Accessibility Tests

Next, Tyler talked about accessibility testing. His team uses AXE to perform static validation for ARIA standards on application pages, including contrast checking.  Tyler explained how AXE helps perform contrast checking. For instance, if a button is one color, does the background let the button stand out? AXE can do the contrast checking by comparing the two colors. Or, will the text be readable against its background – or is it too small or too thin? Again – AXE can make these calculations.
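Static checks like these can run right alongside the unit tests. Here is a small sketch using the jest-axe wrapper around Axe with the hypothetical Button (one common way to script Axe; Intuit's setup may differ, and some rules such as color contrast need a real browser rather than jsdom to be meaningful):

    import React from 'react';
    import { render } from '@testing-library/react';
    import { axe, toHaveNoViolations } from 'jest-axe';
    import { Button } from './Button';

    expect.extend(toHaveNoViolations);

    it('Button has no detectable accessibility violations', async () => {
      const { container } = render(<Button label="Continue" />);
      // Axe statically analyzes the rendered DOM for WCAG/ARIA rule violations.
      const results = await axe(container);
      expect(results).toHaveNoViolations();
    });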

Tyler also talked about HTML customizations. For example, you want to use a special radio button design that isn’t included in the existing HTML. You build this behavior as a custom implementation – and then you have to validate this behavior across the browsers your customers use. Tyler pointed out that companies make these decisions to help brand their websites for customers. And, these custom behaviors must be validated for functionality and accessible behavior across multiple platforms.  At the same time, since you are replacing functionality built into HTML, keyboard navigation still needs to behave as expected. So, you need to test those behaviors. 

At Intuit, the team tests these behaviors manually – and with tools designed to help. By using popular screenreaders like Jaws, Voiceover, and NVDA, Tyler’s team ensures that these screen readers don’t get thrown off by customizations. Also through manual testing, the team checks to ensure that keyboard navigation results in expected page responses – for example, they make sure that focus states behave as expected. And, they also ensure that the page behaves correctly when a customer has selected “reduced motion”.

Performance Tests

From a design perspective, the design system simplifies app delivery. By using standard libraries and building standard behaviors, the design system creates this great look-and-feel. At the same time, developers need to consider the impact on performance. For instance, a specific package might add great mobile behavior – at the cost of an additional 1-2 seconds of page load time. Users likely won’t value the behavior over slower page load time.  

You can choose a great library that gives you great functionality – but at what cost? How does the size of your Javascript package impact your page downloads? Often, this problem creeps up on you.  You use an early version of the library and everything works well. But, when you update your application, your library update suddenly doubles in size. This happens quite often. New versions of libraries include new features – which add to the library load times. And, you have to rethink the software you’re using.

Tyler talked about how Intuit handles these kinds of challenges. If Intuit uses code from an open source project, they might contribute back to the project to make a library with reduced size. Or, they might use a different open-source package.  

Visual Testing

Each time a new component gets introduced or modified, Intuit runs tests through Applitools Ultrafast Grid. Intuit has been running the Grid since its first beta tests. With Ultrafast Grid, Intuit can set checkpoints and take screenshots across a range of browsers and devices. They can set multiple screen sizes (from mobile to Full HD and higher). Their goal is to ensure that responsive pages behave as expected and to understand the impact of design changes on existing pages. Ultrafast Grid gives them the capability to easily compare changes across different versions of an application.

Tyler talked about using Ultrafast Grid to test against Chrome, Edge, and Safari. He talked about setting different viewport sizes to check responsive page behavior against expectations. 
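A rough sketch of how that kind of browser and viewport matrix can be declared with the Eyes SDK's Configuration API in TypeScript (hypothetical choices, not Intuit's actual settings; the same matrix can also be declared in an applitools.config.js when driving tests from Storybook):

    import {
      Eyes,
      Configuration,
      BrowserType,
      DeviceName,
      ScreenOrientation,
      VisualGridRunner,
    } from '@applitools/eyes-selenium';

    // Ultrafast Grid captures the page once and re-renders it against every
    // browser, viewport, and device emulation configured below.
    const runner = new VisualGridRunner(5); // up to 5 concurrent renders
    const eyes = new Eyes(runner);

    const config = new Configuration();
    config.setAppName('Design System');
    config.setTestName('Button - cross-browser rendering');
    config.addBrowser(1920, 1080, BrowserType.CHROME);      // Full HD desktop
    config.addBrowser(1366, 768, BrowserType.EDGE_CHROMIUM);
    config.addBrowser(1280, 800, BrowserType.SAFARI);
    config.addDeviceEmulation(DeviceName.iPhone_X, ScreenOrientation.PORTRAIT);

    eyes.setConfiguration(config);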

Component and Mock Level

Tyler explained how Intuit tests components and mocks. 

In Storybook, Intuit keeps the latest components, as well as page mocks for each component in use. When a change occurs in the design system, Intuit can do two kinds of checks:

  1. They can check the components to see if the component changes have violated any specifications
  2. They check the mocks, to determine how the component changes affect page rendering.

So, for instance, Intuit can see if a collection of multiple new components violates spacing guidelines or other design requirements on pages. Applitools Ultrafast Grid helps Intuit do this evaluation.

Using the mocks, Intuit can also see if a change violates accessibility guidelines, using AXE.  

Accessing Intuit Tools for Design Systems

Tyler discussed the two tools Intuit uses to ensure success in their design systems.

The first, the Intuit DS-CLI, makes it easy to implement Storybook for your designs. Just like Create-React-App lets you build a web app easily, DS-CLI contains all the code you need to utilize Storybook for component and mock management. This includes a preconfigured Storybook, component templates with best practices, great TypeScript support, and bundle size tracking. Tyler mentioned that DS-CLI is free and open source.

One great part about DS-CLI is size tracking.  Each component can be tracked over time for package size. You can see when your bundles are growing – and why.

With the second, Intuit Proof, you can automate test runs in Storybook. Proof is a tapable test runner for Storybook. It helps you automate tests for components and mocks by tracking which ones have tests. In addition, Proof has built-in hooks for Applitools and Axe. So, instead of fumbling about inside Storybook to see what has been tested and what remains, you know which tests exist, which still need to be written, and where each one stands. 

Proof is also free and open source.

Conclusion

Tyler’s presentation made it clear that a design system can make your life much easier, but you need to invest in tools and methodology to help ensure consistency – the benefit of a design system. 

You need a component evaluation system. You need mockups of your pages to test. And you need a testing environment.  Intuit does all this inside Storybook.

Next, you need a testing strategy to get components evaluated and integrated into your pages. Invest in unit testing and test runners. Especially if you care about being perceived as open to all users – invest in accessibility tools. And, finally, know that rendering matters – invest in visual evaluation solutions like Applitools.

Finally, you need some pieces of infrastructure to make your design system work. Intuit shares theirs open source.

If you make these investments up front, a design system can serve your development and test strategy to help you easily develop incremental improvements across your application without fear of code changes. 

