All Dressed in Data: How Today’s Wearables Can Inform Tomorrow’s Fashion

Believe it or not, a lot of the character associated with trends—whether it’s a fitted silhouette, bold colors, or sharp accessories—boils down to data. Consider the power of a photograph: when converted into data, that online photo of you and your friends looking stylish at a wedding could soon spark a chain reaction that starts a regional fashion trend. The catalyst? All that precious data you’re wearing.

As machine learning and wearables get better at reading imagery—and as online images proliferate—different types of data become accessible to neural platforms capable of creating new ways to relate to clothing and trends. “Photo and video content are now at the same level as other types of data,” says Jessica Graves, data scientist and founder of fashion consultancy Sefleuria. “Every image ever produced by a fashion company is now data: all magazine photos, all archives. And in a structured, useful format that connects back to customer behavior.” Tracing the path to fashion’s future calls for creating new and innovative ways of collecting this data. The first place data scientists look? Our virtual closets.


The Latest Look: Cataloging Your Style Through Data

In just the past two years, the ways in which people access clothing and style information online have changed the game. “Any image or video can easily be sent through a neural network,” Graves says, “to tell you everything about the garment, shape, silhouette, and detail.” Data may be further analyzed for time, season, and location, as well as waxing or waning popularity. More data points often mean more potential insights, but the key is finding quality information that deepens understanding of a solvable problem.
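To make the idea concrete, here is a minimal sketch of the kind of aggregation described above: counting how often garment tags appear per season to spot waxing or waning popularity. The records and tag names are invented for illustration; a real pipeline would get them from a model classifying photos.

```python
from collections import Counter

# Hypothetical detections: each record is a (season, tag) pair that an
# image classifier might emit after tagging a photo's garment attributes.
detections = [
    ("spring", "flare"), ("spring", "flare"), ("spring", "bell bottom"),
    ("summer", "flare"), ("summer", "skinny"), ("summer", "skinny"),
]

def tag_counts_by_season(records):
    """Count how often each garment tag appears in each season."""
    counts = {}
    for season, tag in records:
        counts.setdefault(season, Counter())[tag] += 1
    return counts

by_season = tag_counts_by_season(detections)
# "flare" drops from 2 spring sightings to 1 in summer -- a waning signal --
# while "skinny" appears only in summer: a waxing one.
```

Comparing a tag's count across seasons is the simplest possible trend signal; real systems would also weight by location and audience size.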

This is exactly what fashion marketplace Lyst is training its algorithms to do. Lyst looks at the words retailers attach to garments in addition to photos in an effort to demystify and refine the search terms we use when hunting for a specific style or look. “Language is crazy,” says Katy Lubin, Lyst’s public relations manager. “There are about 15 words for every jean style, so if you’re searching for ‘flare,’ you might be interested in ‘bell bottom.’ It’s a moving huge beast of a body of data.”
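The synonym problem Lubin describes can be sketched as a simple query expansion: group equivalent style names together so a search for one surfaces garments tagged with the others. The style groups below are illustrative examples, not Lyst’s actual vocabulary or implementation.

```python
# Illustrative synonym groups for denim styles -- invented for this sketch,
# not Lyst's real search vocabulary.
STYLE_SYNONYMS = [
    {"flare", "bell bottom", "bootcut"},
    {"skinny", "slim fit", "cigarette"},
]

def expand_query(term):
    """Return every style name in the same group as the search term,
    so a search for 'flare' also matches 'bell bottom' listings."""
    term = term.lower()
    for group in STYLE_SYNONYMS:
        if term in group:
            return sorted(group)
    return [term]  # unknown terms pass through unchanged
```

In practice these groupings would be learned from retailer descriptions and click behavior rather than hand-written, but the lookup at query time looks much the same.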

Artificially intelligent image crawlers and fashion-conscious search functions might be leaps and bounds ahead of yesterday’s offline style help, but they’re only the beginning. Today it’s our simple touchscreen taps and photos that are being reinterpreted as small points of data. Add up enough of those points across identifiable communities and you’ll start to see the future: predictive fashion.

Next Season and Beyond: Finding New Styles in Our Data

Today we often think of identifying data as a checklist of traits we’re born with (like race and gender), but self-projected information about who we are—and even our sense of style—could soon be part of a bigger data conversation that seamlessly connects the runway to Main Street. Previously unheard voices and their self-defined traits might soon become an important behind-the-scenes force driving the cutting edge of fashion.

“What if customers were connected because they’re all excited about wearing bright, bright colors with a statement necklace?” wonders Graves. “What if you could get that specific?” Brands will stop guessing how much to produce, Graves predicts, and instead ask “‘How can I make just enough for the right group?'”

Social exchanges of trends, fueled by the flow of data from everything from “Likes” to location-tagged photos, may soon create a world where machines automatically know who wore what where, as well as who wore it best. Where to find that same look, along with the list of local merchants riding the trend, could be just a voice command away.

Style-behavior data traveling between individuals can bring communities closer, as people increasingly share their culture through cataloged images and video—and even newer and more expressive evolutions of emojis and GIFs. Commerce will follow the data trail “to look at the non-explicit connections between people,” predicts Graves, and small and large fashion houses will easily be able to find interested subcultures through this feedback loop of connectivity.

As the conversation between the catwalk and niche communities continues to change, retail stores will soon undergo radical changes in response to a coming deluge of fashion data. Shoppers can expect to virtually ‘see’ clothes on themselves from wherever they are in a realistic 360-degree view. The concept of a narrow fitting room limited to store inventory may disappear thanks to AR imaging and the ‘endless aisles’ of online selection. Choice, fit, and finish will be as vast or as limited as we like, allowing us to effortlessly explore that sweet spot where style meets imagination. Even the notion of store-bought clothes may morph—along with the garments themselves—as advanced materials stretch what’s possible in at-home printing.

When we venture outside to the newest brick-and-mortar stores, the main attraction will be an even more data-driven take on the showroom experience. Retailers will reimagine and redesign the roles of the staff, sales floor, point-of-sale, and stockroom, shaping the in-store experience around a hidden trove of style and preference data that travels with us. In a sense, we’ll be able to try on whatever the data can imagine, without wrangling on-site inventories or old-school logistics.

While good style is timeless, it’s the ways we achieve it that are always in flux. As different types of data and data science help evolve the fashion conversation, the possibilities for self-expression become endless. We’re embarking on a hunting and gathering mission when it comes to aggregating and understanding the data behind our style. As the quality of the data improves, the power to quickly identify emerging trends within communities will skyrocket. In this not-too-distant future we won’t just be able to have killer style, we’ll directly influence and predict it.


This content is produced by WIRED Brand Lab in collaboration with Western Digital Corporation
