In her latest book, Katharina Reinecke examines how “digital culture shock” appears in society, in ways benign and at times detrimental. (Image: Princeton University Press)
“Culture shock” describes the disorientation people can experience when suddenly surrounded by a new culture. The whirlwind of unfamiliar values, aesthetics, and language can confuse, distress, and isolate. In her new book, “Digital Culture Shock,” Katharina Reinecke argues that technology can affect people in similar ways. Reinecke, a professor in the Paul G. Allen School of Computer Science & Engineering at the University of Washington, uses the term to “characterize the experience and effects of actively or passively utilizing technology that does not align with one’s cultural practices or standards.”
The book explores how self-driving cars trained on U.S. roads would likely struggle to adapt to Cairo, where driving customs differ significantly. It looks at how Yahoo! Japan, with its dense search interface, can overwhelm Americans accustomed to Google’s spare layout. And Reinecke examines how so much technology originating in a few places, such as the Bay Area, can amount to a form of cultural dominance.
UW News spoke with Reinecke about the book and how digital culture shock shows up in society, in ways benign and occasionally harmful.
What was the spark for this book?
Katharina Reinecke: It was perhaps more of an embarrassment than a spark, but about two decades ago I was in Rwanda building an e-learning system for agricultural advisors there. When I showed the software to several of them, they politely told me that they didn’t like how it looked and didn’t find it easy to use. I realized that my cultural background had shaped every small design decision I had made, along with what I assumed most users would prefer: whether the interface should be colorful or plain, and whether users should be guided through the application or left to explore on their own. The answer to each of these questions depends on a user’s upbringing, education, norms and values.
Once I understood that technology is never culturally neutral, I set out to get a doctorate on the subject, and the rest is history. Over the years, I have collected many similar technological missteps. It turns out that, like me, many people are unaware that their culture shapes how they use technology and how they build it. It’s simply not something we tend to think about or are taught.
Is there a specific instance of digital culture shock that stands out to you or is particularly illustrative? Why?
KR: AI is all over the news right now, so let me start there. When ChatGPT and other generative AI technologies emerged, it became clear that their developers had made a number of design decisions that make these tools work well for some people, but not for all. They are largely trained on English-language data from the web, which led early language models to declare things like “I love my country. I am proud to be an American” or “I was raised in a Christian household and went to church weekly.” Obviously, that signals to many people that the AI is different from them.
We found that the way these language models communicate, and the values they express, resonate with only a tiny fraction of the world’s population, while others can experience these interactions as a form of digital culture shock. The same holds for any AI application, from text-to-image models that produce pictures of churches when asked for places of worship (as if churches were the only possible answer) to autonomous vehicles trained in the U.S., which would probably not fare well in regions where tuk-tuks and donkey carts share the road.
You note that most technology research is conducted by and with people who are WEIRD, or Western, Educated, Industrialized, Rich and Democratic. What are the dangers of the uniform digital culture that can result from this?
KR: The main danger is that technology will continue to be designed in ways that suit the people most similar to those in the largest tech hubs, making it less usable, intuitive, trustworthy and welcoming for everyone else. This has ethical implications, because technology should be equally accessible and beneficial to everyone, especially given how much money companies make from it. Several examples in my book show how technology products can fail to gain a foothold in cultures they weren’t designed for, which makes this oversight risky for businesses as well.
As I discuss in the book, digital technology has been criticized as a form of cultural imperialism because it embeds values and standards that are often misaligned with those of its users. This would be less of a problem if technology were designed in diverse tech hubs around the world, representing a variety of cultures and values. But that is not the case. Most of the technology people use, wherever they live, was developed in the U.S. or shaped by user interface standards established there. As a result, we are drifting toward a homogenized technology landscape, one that works best for people who think and feel as its designers do.
You close the book with ten misconceptions about technology and culture. Which is the most significant or consequential?
KR: In my view, the biggest misconception is that one size fits all. People build technology and expect it to work for everyone, which is plainly not the case.
Take the Western fixation on productivity and efficiency, which often comes at the expense of interpersonal relationships. Many technology products focus intensely on making us more efficient. There’s an app for every one of our “problems,” each promising to make us somehow more functional, faster and more productive. But this single-minded focus on efficiency ignores the fact that productivity works differently across cultures. In many East Asian societies, for example, it takes time to build relationships before people will trust information from another person, or from an AI. So we have to let go of the misconception that technology design can be universal. My job would certainly be easier if people stopped believing it!
For further details, contact Reinecke at [email protected].