Ethical Design and Responsible AI with Kat Zhou of <Design Ethically>

Episode Notes:

My guest today is Kat Zhou, a senior product designer who created the <Design Ethically> toolkit that helps teams forecast the ethical consequences of their products. She also just finished her Master's degree in AI Ethics and Society at the University of Cambridge.

She shares her origin story, the immigrant mindsets passed on to her by her parents, and how she came to realize that education should be about far more than how to make money after graduation. Throughout our discussion, Kat offers thoughtful perspectives on AI and on building tech responsibly.

In this episode we cover:

  • (05:30) Her career pivot from pre-law to design

  • (08:37) Her growing up as a first-generation daughter of Chinese immigrants 

  • (14:21) Kat’s hot takes on generative AI and the policies and regulations she hopes to have in place to protect people’s privacy and security 

  • (19:36) The perpetual dark design patterns in our digital experiences and her work founding the Design Ethically project

  • (37:34) The overlooked human labor behind "automated" systems and the unintended harms

  • (41:24) The importance of interrogating the work we do and reflecting on what makes our jobs fulfilling and meaningful

Kat leaves us inspired with a strong reminder that we each have a role to play, through our careers, our lives, and the work we do, in creating the impact we want to see.

Referenced on the Episode: 

Silicon Valley Ep 01 - Minimum Viable Product

Excellent Sheep by William Deresiewicz

Behind the Screen by Sarah T. Roberts

Deceptive Patterns by Harry Brignull

Bullshit Jobs by David Graeber

Rest Is Resistance: A Manifesto by Tricia Hersey

Mentor: Nathaniel "Nax" Axios

Timnit Gebru

Design Ethically

Guest Bio: Kat Zhou

Kat (she/her) is the creator of the <Design Ethically> project, which started out as a framework for applying ethics to the design process and has now grown into a toolkit of speculative activities that help teams forecast the consequences of their products. Through her work with <Design Ethically>, she has spoken at events hosted by the European Parliament (2022) and the US Federal Trade Commission (2021), as well as an assortment of tech conferences. Kat has been quoted in the BBC, WIRED, Fast Company, Protocol, and Tech Policy Press.

Outside of <Design Ethically>, Kat has worked as a designer in the industry for years and recently completed a Master's degree in AI Ethics and Society at the University of Cambridge, with the Leverhulme Centre for the Future of Intelligence. She lives in Stockholm with her adopted husky, where you can find them running around.

Twitter/X | Website

  • Ep 03 - Ethical Design and Responsible AI with Katherine Zhou of <Design Ethically>

    [00:00:00] Kat Zhou: We use these metrics as proxies, like, oh, user engagement is good, it's high, so that means everyone's loving our product, right? But then think about some of the products that have really high engagement: cigarettes have great engagement, great retention, and it's not because they're a phenomenal product by any means.

    [00:00:16] Kat Zhou: And so in design and in tech, we borrow a lot from this kind of behavioral economics school of thought, where we actually apply these practices, and it's normalized and it's encouraged. And we're like, yeah, we should totally change the way that people decide on things.

    [00:00:36] Ariba Jahan: Welcome to Up Next in Tech, a podcast diving into emerging innovations and experiences beyond the hype, through exclusive conversations with the brilliant people exploring the intersection of emerging tech, responsible innovation, and how we live, play, and interact. I'm your host, Ariba Jahan. My guest today is Catherine Zhou, a senior product designer who created the Design Ethically toolkit that helps teams forecast the ethical consequences of their products.

    [00:01:04] Ariba Jahan: She's also working on her master's degree exploring AI, ethics, and society at Cambridge University. On this episode, Catherine provides thoughtful perspectives on AI, from the hype, to the overlooked human labor behind automated systems, to the unintended harm. We discuss building tech responsibly, the role of regulation, and her hopes for community-driven innovation.

    [00:01:26] Ariba Jahan: Let's dive in. Today we have Katherine Zhou. She wears many, many hats. She's a master's student at the University of Cambridge, dialing in, zooming in from Stockholm, and she's also the creator of Design Ethically. Really, really excited to have you here, Kat.

    [00:01:43] Ariba Jahan: Thank you so much for making the time for this.

    [00:01:45] Kat Zhou: Thank you for having me, everybody. It's so nice to be on this podcast. It's summer in Stockholm, and I've been living here for three and a half years. So basically, when summer comes around, you forget the dismal nature of the rest of the year, which is the true Stockholm syndrome, when you're like, oh, it's actually nice to be here.

    [00:02:02] Kat Zhou: I love this place. It's not like it's cold nine months of the year.

    [00:02:06] Ariba Jahan: Like you just forget all of that.

    [00:02:08] Kat Zhou: So I completed a master's in artificial intelligence, ethics, and society. And it's really fascinating, because it's the inaugural program that they had on this topic.

    [00:02:18] Kat Zhou: And it was hosted by the Leverhulme Centre for the Future of Intelligence. And I've always been really keen on ethics, whether it was in design or just in general, and I've been curious about AI as well. And so this is like a cool marriage of disciplines. It was fascinating to be part of it.

    [00:02:36] Ariba Jahan: Backing up a little bit, tell us your origin story, because I think it's really interesting.

    [00:02:41] Kat Zhou: So, uh, yeah, I think for me, if I had to really sum up who I am, first of all, I would say I am the eldest daughter

    [00:02:50] Kat Zhou: in an immigrant family, I think that says a lot about me. I, my parents moved to the U S when they were in their twenties to go to grad school and they moved to Minnesota of all places. So that's where I was born I grew up there with my siblings, and ended up, having a lot of curiosities when I was a kid.

    [00:03:08] Kat Zhou: And when I went to university, I thought I would actually be a lawyer. So I started out on the whole pre-law track, and I feel like any kid that's like, I want to be a lawyer, was probably a little bit of an asshole when they were a kid. But I was like, I'm really argumentative. I love this.

    [00:03:25] Ariba Jahan: Um, when I prove to them that I have the receipts, right?

    [00:03:28] Kat Zhou: Exactly. Um, I was on the whole public policy and ethics path for a while in undergrad, but then realized I actually missed the creative side as well. And so I switched to design, like visual arts, halfway through my college years, and I ended up graduating with that.

    [00:03:46] Kat Zhou: And that, I think, kind of explains how I started the Design Ethically project, because I went into design super excited about it. And that was back when the way that we talked about tech and design in universities in the U.S. was still very much shaped by this neoliberal logic of, yeah, tech is the best.

    [00:04:06] Kat Zhou: Like, it's going to solve all these problems. Tech is going to revolutionize the world. There's this episode of Silicon Valley, I think it's the first episode, when they're at the TechCrunch Disrupt conference or whatever, and it's all these startup kids going on stage being like, I have an app that's going to solve this, this, this, and this. And nowadays, when we look at that, we're like, that's so unrealistic.

    [00:04:28] Kat Zhou: But that's actually, I think, what we were taught in some ways when I was in uni. It's just this really positive outlook on the industry, on what design could do, on what engineering could do. And it led to this kind of ethos of move fast, break things, you know, yada yada. And over the years, we in the industry realized we can't actually do this. This techno-solutionism idea doesn't work, because it actually has disastrous consequences sometimes. And it was through my love of, you know, the humanities and ethics and policy and stuff, combined with my love for design, that I started this project.

    [00:05:05] Kat Zhou: I was really curious about what it meant to actually bring ethics into the design process, to interrogate what we were doing as designers, how we made things. So yeah, that's me in that realm. As for where I am right now, I'm in Sweden. I moved here three and a half years ago by accident, because I thought I applied to another office in New York City but ended up applying here, and it was a very happy accident.

    [00:05:30] Ariba Jahan: That pivot you made from law to design, right? I know around that time you realized that's what you wanted to switch into, but what was the moment that told you that's the switch you wanted to make?

    [00:05:40] Kat Zhou: That's a great question. There was actually a book called Excellent Sheep by William, I'm going to butcher the last name, Deresiewicz, and I had read this book in college, actually the year before I switched. And it was a critique of the way that American elite universities, a lot of private universities, but also state schools as well, have completely transformed their education into this pipeline. It was very noticeable. You had kids that were either going to become consultants, investment bankers, pre-law track, pre-med track, comp sci folks.

    [00:06:15] Kat Zhou: And then a handful of liberal arts kids or whatever that went and became journalists. But the majority of students went into some of those first fields that I noted. And his point is that these schools are just paving these paths and creating these expectations that this is what you have to do to become successful, and this is what you want to strive for as you go to university and proceed after you graduate.

    [00:06:40] Kat Zhou: And we had so many career fairs, and we had, like, talks with the career office or whatever it was called. And it was such a big thing on our minds all the time. You were constantly getting ready for interviews for internships, let alone, you know, actual jobs. I remember I had an interview process with LinkedIn when I was in college.

    [00:06:59] Kat Zhou: I didn't get the job, I didn't get the internship, I mean, but I had seven interviews for an internship. It was so intense. And, you know, of course, not all of it is to blame on the universities; there are also a lot of outside factors, like the companies themselves.

    [00:07:11] Ariba Jahan: Well, it perpetuates, right? One organization does it, and then it somehow becomes, like, that's the best practice now, so now we're all going to do this, you know, and no one actually checks it. And if there's even a voice that says, hey, I don't think this should be a best practice, usually that voice is silenced by everyone else who's like, no, well, this organization does it, so it must be.

    [00:07:34] Kat Zhou: Exactly. And I remember all the events they'd have at university, where they'd invite, oh, this person worked at McKinsey, they're going to come back and give us a talk about how they, you know, succeeded in life. And it was all these business organizations that would kind of reinforce this.

    [00:07:49] Kat Zhou: But also, like, professors who would bring back people that had worked and climbed up the industry in these huge corporations. And it was seen as this idolized pathway, right? And so the author of this book is kind of criticizing that, and pointing to the fact that

    [00:08:03] Kat Zhou: education should really be about learning to see differently, to hold different truths in your mind, to really embrace things that you're not familiar with, to think about the world in different ways, as opposed to molding us all into these worker bees

    [00:08:21] Kat Zhou: And sending us off into like the corporate world or whatever, whatever field it is. Right. when I read his words, I was so struck by that, cause I really did feel like a lot of my childhood was kind of crafted around conforming to that kind of narrative.

    [00:08:35] Kat Zhou: And even from, like, when I was in high school,

    [00:08:37] Ariba Jahan: The immigrant mindset, right? It aligns almost too perfectly. But you're first generation, right? So you absorbed all the immigrant mindsets from your family. Yeah.

    [00:08:48] Kat Zhou: And I understand where it comes from for them, because it's a thing of safety. It's like, okay, this is how you can at least be financially safe in this society: go down these pathways. So I understand that, but it's one of those things where you realize education should be way more than just how to make six figures after you graduate.


    [00:09:08] Ariba Jahan: How have you redefined what success looks like for you? Because I think that's been a very personal journey of mine, where even if my parents weren't telling me explicitly, these are the metrics of success

    [00:09:22] Ariba Jahan: We plan to measure you by you can kind of tell by the language they use and by what they say to relatives. The metrics of success that they're evaluating themselves by, and it's closely tied to the success of like what I'm able to do in my career, right? As a result of immigrating. So I'm curious, like from your upbringing and then this, you know, this sense of awareness that you've built, what is your redefined definition of success?

    [00:09:49] Kat Zhou: My wake-up call was when I realized how gravely unhappy I was trying to follow that path. And that moment of unhappiness, I think there were two moments. One of them was after my first year in college, when I had been working so hard to

    [00:10:03] Kat Zhou: fit this kind of narrative, and I was absolutely burnt out. And the second time that happened was right after college, when I had taken a job that followed the path that everyone goes down, and I realized, this isn't it for me. Everyone's been saying this is the thing that you should do.

    [00:10:18] Kat Zhou: And I hated it. So I think that's a good pulse check: are you happy following this successful path that everyone talks about? And if you're not, then it's not success for you. And so in terms of figuring out what was success for me, I think that just came a lot from introspection, and also a lot of inspiration from

    [00:10:41] Kat Zhou: other writers and activists and people who have pondered these questions in history, who maybe are not, you know, your quote-unquote mainstream or stereotypically successful person that these colleges or society talk about, right? And so I got a lot of inspiration from people that had been doing community organizing, and people that are fighting for a cause that is bigger and more meaningful than just their paycheck or their resume or whatever.

    [00:11:09] Kat Zhou: I think that's really inspiring to me. And when I think about what is successful, or what is fulfilling at least, since successful is a very grating word to my ears, what is fulfilling is finding that kind of community that you can contribute to and help out with: helping the underdog, fighting for vulnerable people.

    [00:11:30] Kat Zhou: I think that is what drives me. And I think it drives a lot of people, you know. I think that's an instinct that we all have, and we're often taught to minimize it over time. But I think we all have that instinct inherently, and we just need to remember it and tap into it. And there are so many inspirational folks to look to and kind of follow.

    [00:11:51] Ariba Jahan: You know, what you just said really struck a chord with me: the desire to help others, especially the underdogs, or those that are already under-resourced or underserved.

    [00:12:02] Ariba Jahan: It's like, that is an instinctual thing. I think what society and capitalism have really done is treat that as a nice-to-have, right? Similarly to what we've seen done to any careers outside the realm of quote-unquote business, you know, traditional, McKinsey-suit business, versus everything else.

    [00:12:23] Ariba Jahan: That's like arts or creative expression. Those all get treated as a nice-to-have or a hobby. You know, I think we keep seeing this perpetual pattern that kind of keeps people being that worker bee, what will sustain and provide and profit capitalism, versus everything else. If it doesn't have a direct tie-in to capitalism, it feels like it gets shunted to the side. And to your point, you know, the burden falls on people to self-educate, to self-discipline, to understand, how do I use my voice

    [00:12:55] Ariba Jahan: in the rooms that I'm in, at the tables that I have the privilege to sit at, to advocate, versus people in positions of power not often asking those same questions, furthering this perpetual pattern of needing to constantly advocate, right, for the most vulnerable.

    [00:13:09] Ariba Jahan: I'm curious, how do your parents describe what you do?

    [00:13:13] Kat Zhou: That is an excellent question. I don't think they really know. I don't think they have a clear grasp on what I do. But that's okay. Lots of questions.

    [00:13:26] Ariba Jahan: That's funny. I think only now my mom, she's heard me say certain words enough times so that she knows how to mirror the words, but not in the right context.

    [00:13:38] Ariba Jahan: So, like, she says things like, so how is Web3? Like it's a person, you know, how are they doing? And to me that's close enough. I'm like, that's great, I can still answer this question, we can work with that. Yeah. Right. Um, so I know that you said you're pretty much wrapping up your master's, right?

    [00:13:59] Ariba Jahan: Yeah. Okay. So I know there isn't a day that we don't see AI in the headlines, and I'm sure when you first started your master's program, you had no idea this was about to happen. Or maybe you did. I would just love to hear, as someone who's deep in it, from a studying perspective and an understanding-the-implications perspective, what's your take on generative AI right now?

    [00:14:21] Kat Zhou: Yeah, that's a really, really interesting topic at the moment. And the way that I understand it, I kind of see it through this lens of exploitation, and how much exploitation is happening through these programs, right? Because the way that these models work is that they're being trained on a lot of data, right?

    [00:14:39] Kat Zhou: And this data that is training these models is frequently obtained just by stealing it. You know, like, when you think about the generative art programs, the things that are creating these beautiful images, and you're like, wow, how did they make this baroque painting or whatnot?

    [00:14:57] Kat Zhou: It's because they've taken all these baroque paintings that actually do exist and just fed them into the model. And they are doing this with artists who have posted their work on, you know, art sites like Dribbble and whatnot. And these artists are not normally getting compensated for this. And they're scraping a variety of different platforms for this kind of content.

    [00:15:18] Kat Zhou: So they're not getting compensated for it. Oftentimes they don't even know that they've been stolen from, in that sense, until they see a trending piece that someone's tweeted about, and they're like, what? That looks exactly like my thing. And it's funny, because sometimes some of these models will spit out an image and you can even see fragments of an artist's signature all jumbled up.

    [00:15:37] Kat Zhou: And so it's interesting, because this is work that has been done, that has been fed through to this model, and it's generating this image that people are probably paying for. I know a lot of apps are charging money for this, and none of the money is going back to the original artists who did the labor that was fed into the model.

    [00:15:56] Kat Zhou: And so there's this whole obscuring of the work that they did, and they're not getting rightfully compensated. It's also like, how do you compensate that, right? Unless you're explicitly saying, hey, can I get art from you and I'll pay you this much. If you're just scraping things,

    [00:16:12] Kat Zhou: you know, it's really hard, to, to follow up with everyone that you've just taken stuff from. so I think that's something that we really have to interrogate. It's just like, are we going to allow these companies to do that? Um, and it's not just artwork from artists that they're taking, they're taking like social media posts.

    [00:16:27] Kat Zhou: So stuff that we've written, that we've tweeted, that we've put on whatever platforms, that is also being used. And it makes you wonder, are you comfortable with that being fed into these models? Even when you're interacting with, like, ChatGPT, for example, the stuff that you're typing in can also be used as well. So it's a very gnarly, messy field for sure.

    [00:16:51] Ariba Jahan: I'm sure the work towards generative AI has been going on for, like, ages. It just so happens that only in recent years has it been distributed to the masses, where now everyone has a new vocabulary and a new tool, and they don't even know what questions to be asking or thinking about.

    [00:17:08] Ariba Jahan: Could you break down, like, what's the simplest way someone can understand what generative AI is?

    [00:17:13] Kat Zhou: I would say it's any kind of deep learning program that takes an input, like data, which could be text, images, whatever, and learns off of that existing input in order to be able to spit something out

    [00:17:29] Kat Zhou: that's quite similar. And so it's almost a pattern-based type of logic in that sense. And it's fascinating, because, you know, I think a lot of times when we talk about it in the common vernacular, we think of it as very, like, oh, it's very magical.

    [00:17:44] Kat Zhou: And it gets marketed as this very magical thing, but it is actually just taking stuff from what exists around us. And it is very real in that sense. And so I think when we're thinking about who owns, you know, the IP that comes out of this stuff, that's when these kinds of questions come into play.
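[Editor's note: the pattern-based logic Kat describes can be illustrated with a deliberately tiny toy sketch, a bigram (Markov-chain) text generator. This is not how large generative models actually work (they use neural networks with billions of parameters), but it shows the same core move: the program learns which things follow which in existing input, then recombines what it saw to spit out something similar.]

```python
import random

def train_bigrams(corpus):
    """Learn from existing text: map each word to the words seen following it."""
    model = {}
    words = corpus.split()
    for current, following in zip(words, words[1:]):
        model.setdefault(current, []).append(following)
    return model

def generate(model, start, length=8, seed=0):
    """Walk the learned word-to-word transitions to produce new, similar text."""
    rng = random.Random(seed)  # seeded for reproducibility
    word, output = start, [start]
    for _ in range(length):
        followers = model.get(word)
        if not followers:
            break  # dead end: the training text never continued past this word
        word = rng.choice(followers)
        output.append(word)
    return " ".join(output)

# Everything the generator "knows" comes from this training text.
corpus = ("the model learns patterns from the training data "
          "and the model spits out text shaped by the training data")
model = train_bigrams(corpus)
print(generate(model, "the"))
```

Every word pair in the output already existed somewhere in the training text, which is the crux of the compensation question discussed above: the output is recombined labor, not conjured from nothing.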

    [00:18:02] Ariba Jahan: So what's happening is, these tech platforms or tech organizations are creating systems that are able to scrape all that's available on the internet, learn off of it, and then build models based off of that learning. The way that we interact with it is, we might put a prompt into ChatGPT and we see an output.

    [00:18:25] Ariba Jahan: That output is based on what that model was able to learn from what already exists. And so, as Kat is saying, that part isn't necessarily magic, even though it feels like magic. What it's doing is, ChatGPT, or the machine itself, did the due diligence it was trained to do: learn off of existing work that other people have created.

    [00:18:47] Ariba Jahan: And so what looks like a magical output to us was actually the result of labor done by many, many other people that we're not able to credit or trace back to, right? So on this question around IP and ownership, you know, we know there are a few different lawsuits in the works right now that have to do with ownership and figuring out who does own it.

    [00:19:12] Ariba Jahan: Is it the original artists? Is it the person who created the prompt? Is it the technology? And how do you even have shared ownership? And I know when we were talking earlier, you talked about policies, and how the EU's approach to tech regulation is so different from the U.S.

    [00:19:30] Ariba Jahan: Frankly, I just don't know that much about the policy differences and would love to hear from you. What are you noticing?

    [00:19:36] Kat Zhou: So I'm not a legal expert by any means, but just having lived over here, I've seen how, when I go on my phone in the EU, my user experience is very different from when I used to go on my phone back in the States. First of all, there are cookie banners everywhere.

    [00:19:52] Kat Zhou: Like, every site you go on, you have to exit out of this banner. It's like, are you okay with cookies? And as far as I could recall, that wasn't a thing in the States. Now it is, though. Yeah, I noticed that three and a half years ago when I moved over here.

    [00:20:08] Kat Zhou: I was like, this is a lot. It was very overwhelming to see that on every site, but I guess the intent behind it was in the right place. And I also noticed, for example, some apps, I think it's Instagram, don't have some of the functionality that is available to users in the U.S.

    [00:20:25] Kat Zhou: And there are specific things that I can't do, I think around messaging or something. It's interesting, because I remember Instagram had a little notice banner a while back saying, hey, because you are a user in the EU, you cannot actually access this feature; you can read this FAQ for more information.

    [00:20:44] Kat Zhou: And it was basically explaining that there was a certain law that this feature did not really pass in the EU.

    [00:20:50] Ariba Jahan: What feature was it?

    [00:20:53] Kat Zhou: It's something with messages, and responding to certain, I don't remember exactly. It was something that didn't really matter that much to me. But yeah, it's interesting.

    [00:21:01] Kat Zhou: And I think, having, you know, spoken a lot about deceptive design patterns and manipulative design patterns, I've noticed that in the EU, these patterns that are really annoying and prevalent on all of our e-commerce sites, our games, a lot of shopping ads and stuff, are acknowledged in, for example, the Digital Services Act in the EU, which is pretty new and is going to be fully applicable in May

    [00:21:29] Kat Zhou: of this year. And these patterns have been kind of acknowledged in the U.S., for example, by the Federal Trade Commission over there, but there hasn't been that same kind of sweeping law passed about it. And I think, from what I can tell, the

    [00:21:43] Kat Zhou: U.S. very much has this hands-off approach, where they're like, okay, we don't want to stifle innovation, so we're just not going to do that much regulation, because companies need to be very competitive and yada, yada, yada. Whereas in Europe, they're a lot more okay with being quite strict: this is not allowed.

    [00:22:02] Kat Zhou: This is an infringement on your right to privacy, or this is an infringement on your right to transparency or whatever, so we're going to ban this, and if you fail to comply, you're going to get fined a lot of money. And we've seen a lot of big tech companies pay those fines because of their violations in Europe.

    [00:22:20] Kat Zhou: And even around AI as well. It's funny, because I think China just released some of their AI policies a month ago or something, and they were actually stricter than the ones that America has. Like, America doesn't have that much. And it's just interesting to see how lax it is in the States.

    [00:22:37] Kat Zhou: And that actually is a bit of a problem as well, because the products that we use from these companies are being used on all different continents and in all different markets. So it's not like they're just used in the States, right? American tech companies' products are being used in Europe and Asia and yada, yada, yada.

    [00:22:56] Kat Zhou: And so when one region is regulated in this kind of lax way and another in this more stringent way, that poses this kind of messy experience, which, as I mentioned with the Instagram thing, can manifest like that.

    [00:23:11] Ariba Jahan: I've never thought about it that way.

    [00:23:13] Ariba Jahan: I don't think I've thought about the nuanced differences between the global experience and the U.S. experience of different technology, until you started noticing the difference between, like, the

    [00:23:27] Ariba Jahan: UX of American or U.S.-based experiences versus the EU. And now, hearing that even China has a stricter policy, what are some, I guess, policies or changes that you wish the U.S. would make?

    [00:23:38] Kat Zhou: I think they should be a lot more proactive about making certain practices illegal. For example, certain deceptive design patterns

    [00:23:49] Kat Zhou: that are specifically meant to deceive people into paying more money, or giving more of their time or data, to these companies. Some of these practices are never good. They're only meant for this kind of malicious or annoying purpose, and they need to be outlawed. And certain, you know, risky applications of AI need to be really restricted.

    [00:24:11] Kat Zhou: And, you know, we always talk about it as if, oh, but we're not going to be able to innovate, and oh, but sometimes you need to do this kind of stuff, or you need to allow these kinds of slightly unethical things in order for this innovation to happen and for businesses to be able to succeed.

    [00:24:27] Kat Zhou: And I think that's kind of bogus. You know, it's like, if you think about your local baker in your neighborhood, and the baker was like, oh, well, I have to put a little bit of heroin in my croissants, because otherwise, how else would I be competitive? Like, no, you don't. If you have to do that, you shouldn't be a baker.

    [00:24:45] Kat Zhou: And so I think we need to start being more strict about that, because the consequences are huge, right? This is not just impacting us as adults. It's impacting children, it's impacting elderly folks, you know, vulnerable people, who lose money because of this particular pattern or whatnot.

    [00:25:05] Ariba Jahan: Are there some patterns that you're thinking of as you're saying this? Just so, when people are listening, they can look out for those patterns as well.

    [00:25:12] Kat Zhou: Well, there was a really common example before it started being shamed a lot more, due to Harry Brignull's work calling these patterns out. One of them is the whole click to subscribe, but call to cancel. So I know the New York Times did this, as well as a couple other sites.

    [00:25:30] Kat Zhou: But yeah, you could easily buy a subscription to some kind of service or a newsletter or whatever online, but then in order to cancel it you had to hop on a phone call with someone who is only available from, like, the hours of 11am to 1pm, and if you miss the window, you're out of luck, which is so annoying.

    [00:25:50] Kat Zhou: And I also hate phone calls anyway. So yeah, that's really annoying. There's also the slightly less egregious, but really snarky, thing that you see a lot in those pop-ups, when they're like, do you want to subscribe to this newsletter? And you can either say yes, or you can say, no, I just want to lose money, or, no, I'm a loser, or something like

    [00:26:12] Ariba Jahan: something really snarky.

    [00:26:13] Ariba Jahan: And it's boring.

    [00:26:14] Kat Zhou: Yeah. And that's a form of emotional manipulation. Exactly. Yeah. So it's really annoying, and they don't need to be like that. And also, for example, another one that I really hate, that I still see sometimes, is the whole, oh, there's five other people with this item in their cart.

    [00:26:32] Kat Zhou: A lot of times it's not true at all. They just put a number up there to entice you to feel like, oh, there's some competition, I've got to buy this. And it's creating this kind of artificial demand and scarcity.

    [00:26:45] Kat Zhou: Yeah, which is really problematic. And, you know, it's even more problematic when you think, okay, these practices are happening to us, but they're also happening to kids, for example, on education apps. And I know the Federal Trade Commission event around deceptive design patterns in 2021, or, as they were called back then, dark patterns,

    [00:27:04] Kat Zhou: had a panel, after the panel that I was on, about deceptive design patterns in educational games. Duolingo uses a lot of these patterns, and Duolingo is used a lot by schools and stuff.

    [00:27:21] Kat Zhou: And there are also other education platforms that kids are using that have these kinds of things.

    [00:27:25] Ariba Jahan: So those may not be about purchasing, right? Like on Duolingo, is it just gamifying, trying to over-gamify learning?

    [00:27:35] Kat Zhou: It's the over-gamification of learning, but it is also the purchasing stuff, because I remember when I was trying to learn Swedish through Duolingo.

    [00:27:44] Kat Zhou: And this reveals a lot, because I clearly was not learning it for the right reasons. I was just learning it for the streak on Duolingo. And I made it to like 140 days, and then I missed a day. And then Duolingo had the audacity to be like, do you want to pay 10 to reinstate your streak? And I thought about it for... they got me.

    [00:28:05] Kat Zhou: I almost did. I thought about it for two seconds and I was like, this is absurd. So imagine like a kid, but

    [00:28:11] Ariba Jahan: they understood, you know, on a human level, you want to maintain that badge that you've earned. Oh, wow.

    [00:28:19] Kat Zhou: Exactly. And yeah, a lot of companies do this. And it's just like, that's so silly. If you're trying to teach kids or adults or whomever a language, to gamify it to that extent, just...

    [00:28:34] Kat Zhou: I mean, I can understand why they decided to do that, but I don't think it's right.

    [00:28:38] Ariba Jahan: So, yeah. Just because something can be a KPI doesn't mean it needs to be. Exactly,

    [00:28:43] Kat Zhou: exactly. And that's the thing, it's just like, in our industry, our KPIs, our OKRs, whatever acronym we use about these things, they're always focused on the bottom line.

    [00:28:54] Kat Zhou: And we're in this growth-driven industry that's only focused on short-term growth. How do we get it? What can we do, you know, in the realm of possibility, to get to a 10 percent increase in whatever? It's a sad way to operate.

    [00:29:09] Kat Zhou: And we use these metrics as proxies, like, oh, user engagement is high, so that means everyone's loving your product, right? But then you think about some of the products that have really high engagement: cigarettes have great engagement, great retention. And it's not because they're a phenomenal product by any means.

    [00:29:27] Kat Zhou: And so in design and in tech, we borrow a lot from this behavioral economics school of thought, where we actually apply these practices, and it's normalized and it's encouraged. And we're like, yeah, we should totally change the way that people decide on things and mold them to become loyal customers in that way,

    [00:29:46] Kat Zhou: when it's kind of patronizing, and it's paternalistic, and it can be really problematic

    [00:29:54] Ariba Jahan: at times. Going back to our first conversation around success, right? And just bringing that back here: I feel like a lot of times, whether it's tech or not, businesses often get birthed from the desire to solve problems, from the desire to create value.

    [00:30:10] Ariba Jahan: And then somewhere down the line, whether it's because of VC funding, or surviving in the market, or bringing in revenue, it becomes more about what will impact those bottom lines. And then thinking about that bottom line to an extent where you're kind of losing sight of everything else.

    [00:30:27] Ariba Jahan: You over-OKR it, over-KPI it, to the point where you're not really seeing the bigger picture of all the, not side effects, but direct effects you're creating while you're also getting to your KPIs. I'm curious, you know, from your master's and your focus on AI, what are some policies that you're hoping get created?

    [00:30:50] Ariba Jahan: Because right now the generative AI space is proliferating faster than any policy. So I'm curious, from your perspective, whether it's policy itself or just guardrails, what would you like to see in place?

    [00:31:03] Kat Zhou: That is a great question. And I think there are some good stepping stones or starting points, you know, happening in the EU and the UK, of really clearly defining what is harmful AI, like high-risk AI, to ensure that it's not legal, to ensure that the dissemination of that kind of product, or the creation of that kind of product, is not okay.

    [00:31:27] Kat Zhou: And I think we need to see more of that. We need to see more strict regulation. Regulation aside, what I would really want to see is just more collaboration between people in policy and those who are building these products, which we have been seeing more and more, definitely, but also between, you know, the organizers in communities, right?

    [00:31:47] Kat Zhou: Like, not just folks in tech, but people who are just your everyday citizens, who are also very much impacted by this technology, but maybe don't have a platform to talk about it. So yeah, I think it's a very tricky issue, because in some ways, Pandora's box has already been opened.

    [00:32:06] Kat Zhou: And a lot of these technologies were open source. People can just take them and run with them, and make their own spinoffs. And that's tricky. It is really tricky. And I don't necessarily know if I have the regulatory panacea for that, but I know that we just need to do a lot more, especially in the U.S., and we need to act a lot faster.

    [00:32:24] Kat Zhou: I think one of the things that you noted is that we're already so far behind the developments that are happening in this space. And that's been a very big pattern in all of our regulatory history, I think, in the U.S., for any of our

    [00:32:38] Ariba Jahan: fields, especially tech. You know, when we started talking about AI, you were really thinking of it through the lens of different aspects.

    [00:32:44] Ariba Jahan: I totally understand what you mean, especially because there's not a lot of threading, giving credit, and acknowledging, and there's erasure of historical work, right? And oftentimes the people that get erased the most are the people that are the most marginalized. Yes. That whole notion aside, especially because I agree with it, I'm just going to put it aside for a second and think about, you know, the fact that you just did a whole master's on artificial intelligence and its implications and all the things.

    [00:33:13] Ariba Jahan: I'm curious, what are the potentials of this technology, of AI, that do give you some hope? That keep you inspired? It doesn't have to be about how it's being utilized right now. But I'm curious, from your perspective, what about AI keeps you inspired, keeps you engaged in it?

    [00:33:30] Kat Zhou: Yeah, I think there are definitely use cases where this technology can be applied in a way that is not, you know, susceptible to the capitalist market and the logics that come with it. And I've heard about AI technology being used in more grassroots capacities. I do not remember what this is called, but I saw a tweet a while back about a group that was using the technology to preserve indigenous languages. I think it was in New Zealand or something.

    [00:34:02] Kat Zhou: It was community driven. So far as I could tell, it was not beholden to any VC interests or steering or anything. So I think that is really cool. I think there's a lot of potential for these emerging technologies to be used by us, the community, in a way that's community driven and not driven by

    [00:34:23] Kat Zhou: investors who are only in it for the money, right? Yeah, if we can own this technology ourselves and use it to build things that are genuinely helpful for us, that is wonderful. And I want to distinguish that strand of applied technology that's helpful for the community from the privatized technology that we've seen in, you know, ed tech or med tech or whatever.

    [00:34:46] Kat Zhou: A lot of these private companies are coming to Europe, for example, and being like, we're a private med tech company and we're going to help you all. But in actuality, their business model is chipping away at the public infrastructure that

    [00:35:02] Kat Zhou: is already there. Their business model is contingent on the failure of these public services provided by the government and communities and whatnot. So I want to distinguish that from the kind of technology that's being applied for the community, in some ways, but from a privatized capacity.

    [00:35:17] Kat Zhou: And so what gives me hope is the former example I talked about: technology that's owned by the people, that is not at the whims of this market or of what investors want all the time, and that can really just be allowed to serve what we need as people, as neighborhoods, as, you know, collectives and communities and whatnot.

    [00:35:37] Ariba Jahan: Yeah, I mean, I just started Googling the example you mentioned about AI being utilized to preserve indigenous languages, and it's like preservation. It's so the antithesis of what we're seeing, right? Of how generative AI especially is being utilized. It's the complete opposite, because, going back to the words we were using earlier, generative AI could perpetuate erasure.

    [00:36:02] Ariba Jahan: Yeah. Monolithic perceptions, creating a homogeneous narrative. Was your master's inclusive of generative AI?

    [00:36:09] Kat Zhou: Yeah, we talked about that as well. We covered the whole gamut, and at the very end we could specialize in what we wanted to hone in on.

    [00:36:17] Kat Zhou: And so I focused on algorithmic content moderation. Oh, say more. Yeah. So, you know, if you use any social media platform, right, you're seeing content from your friends, from whoever you follow. And you're not seeing, I hope, at least a lot of scary content that's violent or, you know, creepy or depressing. At least a lot of the content that is really egregious has been taken off, and it's taken off before you even

    [00:36:43] Kat Zhou: get a chance to see it. And there is someone out there whose job it is to sit at their desk nine hours a day and remove all of those videos, and to vet them all, so the platform is safe for users like us, right? And this job, for a very long time, was done by humans. It still is. But in the last few years, there's been this massive shift towards companies that have AI or algorithmic content moderation, where a lot of it is marketed as automated. And it's reinforcing this narrative of, yeah, we're cutting humans out of this, so that, first of all, they don't have to do this work, but also we can save money, yada, yada, yada.

    [00:37:24] Kat Zhou: And this is kind of a misleading presentation of algorithmic content moderation, because in reality it still is very much reliant on human labor. The writer and filmmaker Astra Taylor wrote about this concept in Logic magazine a while back, calling it fauxtomation. What it encompasses is the idea that we market everything with these gimmicky AI labels, everything's automated, but it actually belies the human work that's going on behind the scenes. And so these algorithmic content moderation companies are claiming that they have all these automated systems. And yet, in order to even run the models that are doing the content moderation, they have to have people train those models, and that training process is extensive. Someone, again, has to sit there and go through all of this footage and annotate: this is something that depicts, you know, suicide, or violence in that capacity, or whatnot.

    [00:38:21] Kat Zhou: And so it's still happening. And, you know, to make it worse, this labor is being distributed in a way that's very reminiscent of colonial patterns of exploitation, right? And so you see a lot of the workers that are doing this kind of work are based in Kenya or the Philippines or Colombia, et cetera, in places that have historically been very much exploited and traumatized by Western companies.

    [00:38:48] Kat Zhou: So we're offshoring this emotional trauma, this emotional labor, to benefit a lot of these companies that are based in the West. So it is this very unfortunate, tricky dynamic that is happening right now. And it's something that we don't really think about, because we don't see it happening, and it's

    [00:39:08] Ariba Jahan: completely invisible,

    [00:39:10] Kat Zhou: right?

    [00:39:10] Kat Zhou: It's being erased. Yeah.

    [00:39:11] Ariba Jahan: For algorithmic content moderation, besides the practice that's being done right now, how do you hope it becomes used? Because I think, you know, the thing here is, like you said earlier, the Pandora's box has been opened. AI isn't going to go back, and we're not going to stop using generative AI either.

    [00:39:30] Ariba Jahan: I think policies might change. Policies might start regulating what gets done. And I'm curious, as it relates to AI being used for content moderation, what's a change you want to see?

    [00:39:41] Kat Zhou: That's a great question. I think part of it has to do with just better rights for these workers

    [00:39:47] Kat Zhou: that are, you know, working for pennies. And I know recently the first African content moderators' union was created, I think, like a month ago, because there was this company called Sama AI, which is a B Corp, that for a while was hiring contractors in Kenya to provide content moderation for Facebook, and for OpenAI as well.

    [00:40:11] Kat Zhou: And these workers were getting paid so little, and the mental health policies were not great at all at the company. And when workers tried to speak out about it, there was a lot of retaliation. And one of the former colleagues who was working in the Kenyan office, Daniel Motaung, actually ended up suing Sama and Meta.

    [00:40:31] Kat Zhou: And so there was a lot of organizing happening around that time, and finally they formed this union. And I think that's such an encouraging step forward, and I hope to see more of these companies compensating these people. And another thing that I'm thinking about, too, is Sarah Roberts's book, Behind the Screen, which is about content moderation.

    [00:40:54] Kat Zhou: In the end, she talks about how one obvious solution is to just limit the amount of content that we have on these platforms, because right now the problem is that we're just inundated with content, right? Of course we could just limit it, cap it. And then she astutely observes that that is very hard to do, and if companies do care about it, it's mostly to cover themselves, to prevent themselves from getting into trouble with compliance stuff.

    [00:41:19] Ariba Jahan: What's, what's something you wish more people knew or used or understood?

    [00:41:24] Kat Zhou: Yeah. I think maybe something that's been on my mind, that I've been having a lot of conversations with my friends about, is the unlearning that I think we're all doing right now, bringing it back to this success thing that we talked about, or about, you know, our jobs and about meaning. And for me, a really transformative book that I read recently was Bullshit Jobs, by David Graeber, who passed away a couple of years ago.

    [00:41:49] Kat Zhou: And he's a brilliant anthropologist. And he was writing about how a lot of these jobs that we have are just bullshit. They're pointless. And he cites people like corporate lawyers and hedge fund managers and, in some capacity, software engineers. And he contrasts that with the important people, who are teachers and nurses and basically everyone that was a frontline employee during the pandemic.

    [00:42:15] Kat Zhou: Right. He wrote this book before the pandemic happened, but we really saw that distinguishing line between whose work actually made a meaningful difference to our day-to-day existence and whose jobs were kind of just there. And I think one of the things that my friends and I have been talking about recently, just given the whole state of the tech world, right?

    [00:42:35] Kat Zhou: With the state of layoffs, a lot of workers, a lot of employees at these companies, thought they were in this family and in this, you know, amazing atmosphere that wanted them to care about everything and yada, yada, yada, but then they were laid off just like that. You know, I think it's a wake-up call for a lot of people: okay, these jobs are jobs, and they're going to give you a paycheck, but it's not really anything more than that.

    [00:43:03] Kat Zhou: And sometimes it's worse than that, because you're working somewhere that might be causing problems. But yeah, that's what's been on my mind. And I think, for me, I encourage people to interrogate the work that they're doing, and to interrogate the kinds of ideas of success that they might have.

    [00:43:18] Kat Zhou: Like, where do they come from? And what makes your job fulfilling and meaningful, as opposed to a bullshit job? And if it is a bullshit job, what path would you actually want

    [00:43:28] Ariba Jahan: to take? I mean, it's such a tough tension, right?

    [00:43:30] Ariba Jahan: Because to your point, in order to even interrogate that, you have to do enough unlearning to even realize that that might be a situation you're in. And then I think the role of privilege comes in, because I think like, even if you have a bullshit job, you may not have the privilege to choose otherwise, right?

    [00:43:47] Ariba Jahan: Like, I think your cultural upbringing, your household upbringing, plus the privilege or lack thereof, all of those have a huge effect on the decisions that might come out of that interrogation. Okay, so I'm going to do a rapid fire. I'm going to ask a bunch of questions, and you'll just answer them. Okay, are you an introvert or an extrovert?

    [00:44:10] Kat Zhou: Introvert.

    [00:44:11] Ariba Jahan: What helps you stay curious?

    [00:44:13] Kat Zhou: Reading

    [00:44:14] Ariba Jahan: how do you nurture play in your life right now?

    [00:44:18] Kat Zhou: I play with my dog a lot, and I go outside. Okay.

    [00:44:22] Ariba Jahan: What's one thing that you're deeply grateful for right now?

    [00:44:25] Kat Zhou: My friends and family.

    [00:44:26] Ariba Jahan: Are they, spread out across different time zones?

    [00:44:30] Kat Zhou: Yeah, different time zones, different continents. but even my friends in Stockholm, I'm really grateful for them.

    [00:44:36] Ariba Jahan: If you could go back and give your 18-year-old self one piece of advice, and I don't know if that 18-year-old was pursuing law or had already changed course, but what's one piece of advice you would give that Kat?

    [00:44:52] Kat Zhou: Follow your gut. Listen to your body and listen to your feelings, because they reveal the truth to you often way earlier than your logic does.

    [00:45:03] Ariba Jahan: What is something you've been doing recently to nurture your mental health?

    [00:45:07] Kat Zhou: Resting a lot. I recently read, Tricia Hersey's book, Rest is Resistance, and I am trying to internalize it as much as I can.

    [00:45:15] Ariba Jahan: What helps you rest? Like, what helps you make sure you are resting? Is there a boundary you're setting? Like, yeah.

    [00:45:24] Kat Zhou: Yeah, definitely boundaries with, you know, work, like, when am I closing my laptop? Sometimes even penciling in time, like, I'm going to go to the park and spend time with a friend for this amount of time, and making sure that I do it and I don't fill it up with errands or anything. Yeah.

    [00:45:40] Ariba Jahan: what's something you couldn't do without in your career? It can be anything from a routine, a person, a service or an object.

    [00:45:48] Kat Zhou: What's something that I couldn't do in my career?

    [00:45:50] Ariba Jahan: What's something you couldn't do without in your career, like it had a significant impact on your career? A routine, or a person, or a service, or an object.

    [00:46:01] Kat Zhou: Oh, that is a great question. I think I want to credit my first mentor, actually. He really pushed me to interrogate some of these systems in which we work and operate. And he was largely the reason why I created the Design Ethically project. Yeah, his name is Nathaniel Axios, or he goes by Nax.

    [00:46:20] Kat Zhou: He's awesome.

    [00:46:21] Ariba Jahan: if you weren't in the role that you're in right now, what would you be doing? And when you answer that, maybe share what you're doing.

    [00:46:28] Kat Zhou: That's funny. I actually love that this is how it panned out, because I'm realizing more and more, when we talk about ourselves, especially in the US, the first thing you say about yourself is, hi, my name is Kat.

    [00:46:41] Kat Zhou: And this is what I do, this is who I am. And why is that the defining question? Why is that the defining piece of information about who we are as humans? Yeah, it's wild. So I appreciate that that actually didn't come up at all. But I am a senior product designer at a tech company in Stockholm. And if I wasn't doing that, I think I might just be in school full time.

    [00:47:02] Kat Zhou: I have loved my master's program, and it's made me realize how much I just love the process of researching, writing, even teaching at times. Like, it's so fun.

    [00:47:12] Ariba Jahan: You know, this podcast is called Up Next in Tech, and I'm curious, what do you think is up next for us as we think about artificial intelligence, specifically, uh, generative AI?

    [00:47:24] Kat Zhou: So I think, kind of like what we were saying before about exploitation, I think we're going to see more of it, unfortunately. Okay. I think that's just the natural course of how our current system works. But what I hope to see is more of that kind of community organizing, of people coming together and saying, this is not okay.

    [00:47:44] Kat Zhou: Speaking out, whistleblowing, grouping together, unionizing, you know, that is so cool. I joined my first union last spring. And the union culture in Sweden is also very different from the US. It's a lot more extensive here. Almost everyone's in a union, basically. And, you know, in the U.S.,

    [00:48:01] Kat Zhou: we're just starting to hear about unions forming at some of these tech companies, which is really exciting. And I want to see more of that. I want to see people come together in solidarity to demand rights, not only for themselves, but for, you know, the environment, and for all of the people that are affected by the technology that we're making.

    [00:48:18] Kat Zhou: So, yeah, yeah.

    [00:48:19] Ariba Jahan: You know, as you said that, it made me think: even the concept of unionizing has gone through the same impact we've been talking about. Due to capitalism and the U.S.-centric culture, unions get a bad rap, right? Like, you unionize because something is bad, right?

    [00:48:34] Ariba Jahan: And it must be so bad that we have to unionize, versus what I'm hearing from you about the EU is, no, everyone should be in a union. Yes. Advocating for and understanding your rights is just an expectation, versus in the U.S., I think the expectation is that you're just a worker bee, you know, you are loyal to the company that you work at.

    [00:48:53] Ariba Jahan: You think it's a family, you know, all those things we just talked about. So it's just interesting. I don't think I've ever thought about unions under that same umbrella too.

    [00:49:01] Kat Zhou: And there's a reason for that. Through the Reagan era, and also in the UK during the Thatcher era, a lot of neoliberal policies were crafted to give unions a bad rap and to make them seem outdated.

    [00:49:16] Kat Zhou: It really eroded the number of unions that existed. And in tech, for example, we never had unions in the beginning, and they kind of used this whole guise of, oh, you work in tech, you have a foosball table, you have swag, you have free meals, why are you complaining? To serve as a distraction. But you are allowed to complain.

    [00:49:34] Kat Zhou: Like, we as workers are allowed to speak up and ask for better treatment, because when we can do that, it makes it better for everybody. So yeah, I think it's important to not forget that kind of history that we have, and how unions, had they not been eroded by intentional planning and policymaking, would probably still be around in a lot higher numbers.

    [00:49:54] Ariba Jahan: Yeah. I feel like now I'm really curious about the state of unions in the U.S., like what companies have unions. And, I'm going to hopefully not butcher their name, but there's a DAO, a decentralized autonomous organization, that's all about labor unions, and they're trying to see whether or not they could use the technology behind DAO infrastructure to mobilize more union work.

    [00:50:20] Ariba Jahan: That is the extent of my knowledge on that topic. But it is just something that I remember seeing, and I was really intrigued. So that's what's up next. Who do you think is up next? Who is someone we should all be watching and learning from?

    [00:50:35] Kat Zhou: There are so many people. Timnit Gebru is obviously a big one for me. She's so inspiring. Tricia Hersey, who I mentioned, doesn't work in tech, but she is a really great influence on the way that I've been thinking about rest and work and grind culture. She's been speaking out about that a lot, and playing into this idea of deprogramming ourselves from this

    [00:50:56] Kat Zhou: hyped-up culture. Those are the two that come to my mind at the moment.

    [00:51:00] Ariba Jahan: So as we kind of wrap up and close out, you know, where can people go to learn more about you or support your work?

    [00:51:07] Kat Zhou: Yeah. So there's designethically.com, and there's a free framework and toolkit on that website. So if you're ever curious about ways that your product team, if you work in tech, can better intervene in or interrogate what you're building,

    [00:51:22] Kat Zhou: there are some exercises that you can do on that website. But yeah, otherwise you can follow me on Twitter. I'm mostly a retweeter nowadays, because it's honestly too much emotional labor to dive into the other stuff. But yeah,

    [00:51:37] Ariba Jahan: Okay, you know, I know you're wrapping up your master's right now.

    [00:51:40] Ariba Jahan: What can people expect from you next, if you have that answer or what are you exploring next?

    [00:51:47] Kat Zhou: I think I want to continue learning more, and I have no idea what that's going to look like. I've realized I'm not a person that does the five-year plan, as in, I have tried many a five-year plan. I've tried drafting many of them, and I never fulfill them.

    [00:52:00] Kat Zhou: So I realized I just need to go with the flow and see what happens. Because, for example, my five-year plan from five years ago did not include moving to Sweden, or going to grad school. So, yeah.

    [00:52:12] Ariba Jahan: I know we covered a lot of different topics. I'm curious, is there anything else that we haven't covered that you really wanna say before we close out?

    [00:52:19] Kat Zhou: I think, for me, I would just say, keep on learning and unlearning. There's so much to learn about the world that's just not in the bubbles that we currently live in. And there's a lot to unlearn, a lot to kind of drop and let go of, just things from the past that maybe aren't true anymore,

    [00:52:38] Kat Zhou: or were never true to begin with. And it can be a very uncomfortable process, but be okay with it, and be forgiving, and keep on trying at it.

    [00:52:45] Ariba Jahan: I love that. I think, you know, unlearning looks so different in every stage of our lives. And you might unlearn something new today and then have to unlearn that in a different context three years from today,

    [00:52:59] Ariba Jahan: because it shows up so differently in different contexts. Um, so I really love that. Thank you so much, Kat, for spending this time with me, for being on this podcast. Really appreciate you sharing all the things: your journey, your experience, your point of view. Um, and yeah, thank you. This has been so fun, and I hope you have a wonderful, wonderful weekend.

    [00:53:22] Kat Zhou: You too. Thank you.

    [00:53:22] Ariba Jahan: Thank you for joining us for this conversation today. If you liked what you heard, be sure to rate and review us on Apple Podcasts and Spotify to help more people discover the show. Really appreciate you joining us today, and be sure to hit subscribe, leave a comment, and come back next week so we can keep exploring what's up next in tech and shape our collective future together.

    [00:53:45] Ariba Jahan: Until then, stay curious.

🙏 LEAVE A REVIEW

If you enjoyed listening to this episode, then please leave us a 5-star review on Apple Podcasts and a 5-star rating on Spotify! Thank you so much for helping us reach more listeners and viewers.

🔗 CONNECT WITH ARIBA JAHAN

Instagram | Linkedin | Twitter/X | Youtube


💻 SUBSCRIBE TO THE NEWSLETTER

📱 LET’S CONTINUE THE CONVERSATION 

What did you learn from this episode?

Share it on Linkedin, Twitter or Instagram and tag me (Ariba Jahan). I’d love to know!

You can also send me a DM on Instagram @ariba.jahan or email podcast@aribajahan.com

🎙 ABOUT THE PODCAST

Want to go beyond the hype of emerging tech and dig into the realities, creative possibilities and responsible considerations behind innovations like Artificial Intelligence, Augmented Reality and Blockchain?

Join me, Ariba Jahan on Up Next In Tech for conversations with product experts, creative technologists, scientists and industry leaders to unpack what it’s really like to work in nascent fields shaping how we live and interact.

Whether you want tactics for your own journey or just inspiration from those leading the way, join us and let's envision the future we want to build together.

Subscribe: Spotify | Apple Podcast | Amazon Music | YouTube | Overcast | Google Podcast | iHeartRadio


More From Up Next In Tech

Previous

Making AI Digital Friends in Augmented Worlds with Alex Herrity of Anima

Next

Hype vs. Momentum: Decoding Cultural Patterns w/ Matt Klein of Zine