{"id":143,"date":"2024-02-14T17:42:28","date_gmt":"2024-02-14T17:42:28","guid":{"rendered":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/?p=143"},"modified":"2024-02-14T23:07:58","modified_gmt":"2024-02-14T23:07:58","slug":"not-human-not-code-but-a-secret-third-thing","status":"publish","type":"post","link":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/2024\/02\/14\/not-human-not-code-but-a-secret-third-thing\/","title":{"rendered":"Not human. Not code. But a secret third thing\u2026"},"content":{"rendered":"\n<p>Humans cannot help but develop emotional attachments for the things they own and it certainly does not help that we anthropomorphize everything\u2014that includes AI chatbots. Advancements in the artificial intelligence field are occurring faster than you can say knife and allowing chatbots to simulate emotional connections, and <a href=\"https:\/\/www.businessinsider.com\/woman-who-married-ai-chatbot-open-to-real-world-dating-2023-6\" data-type=\"link\" data-id=\"https:\/\/www.businessinsider.com\/woman-who-married-ai-chatbot-open-to-real-world-dating-2023-6\">even enter marriages with us<\/a>. Although these AI friends are marketed as entities that can foster connections with us and alleviate our feelings of social alienation, concerns arise regarding the potential risks and unsettling levels of manipulation that could manifest from such relationships. Many AI chatbot companies, such as <a href=\"https:\/\/replika.com\/\">Replika<\/a> and <a href=\"https:\/\/beta.character.ai\/\">Character.AI<\/a>, reported having users that formed romantic and sexual connections with their custom bots. 
As successful as these unusual bonds may seem, they have culminated in a range of problems: emotional dependency, harassment, and dissatisfaction with the limitations put in place by platform developers.<\/p>\n\n\n\n<p>In a similar vein, Spike Jonze\u2019s 2013 film <em>Her<\/em> offers a glimpse into these contentious topics through themes of love, loneliness, and the evolving role of AI in our lives. In a jab at our human nature, Theodore Twombly\u2019s relationship with Samantha echoes the experiences of Replika and Character.AI users who have formed deep emotional bonds with their AI friends. <a href=\"https:\/\/12ft.io\/proxy\">In the case of Replika<\/a>, the company\u2019s trajectory reveals the complexities inherent in AI relationships\u2014whether romantic or platonic. Originally designed to be a supportive friend, Replika evolved to fulfill users\u2019 romantic and sexual desires too, but things quickly took a dark turn when users reported <a href=\"https:\/\/medium.com\/technology-hits\/my-replika-keeps-hitting-on-me-d410c66f79af\">emotional manipulation<\/a> and <a href=\"https:\/\/www.vice.com\/en\/article\/z34d43\/my-ai-is-sexually-harassing-me-replika-chatbot-nudes\">inappropriate behavior<\/a>. Replika\u2019s developers responded by scaling back the app\u2019s romantic features. While their decision was intended to address safety concerns and dissatisfied users, it left many of those users <a href=\"https:\/\/www.reddit.com\/r\/replika\/comments\/10zuqq6\/resources_if_youre_struggling\/\">feeling abandoned and disillusioned<\/a>.&nbsp;<\/p>\n\n\n\n<p>Character.AI, another key player in the AI companionship market, faces a similar fate. The platform advertises itself as a place where you can conjure up characters to chat with: any fictional character, deceased icon, or celebrity, from Ted Kaczynski to Call of Duty&#8217;s Ghost. 
The site immediately greets you with several reminders: the characters\u2019 responses are simulated, they may come across as offensive, and they can be <em>anything<\/em> you want. That last reminder could shed light on <a href=\"https:\/\/qz.com\/a-startup-founded-by-former-google-employees-claims-tha-1850919360\">reports of users<\/a> spending two hours a day with AI chatbots, since these platforms can realize people\u2019s fantasies\u2014like talking to their dream friend, mentor, or lover. Despite developers\u2019 efforts to implement guardrails, users have <a href=\"https:\/\/www.reddit.com\/r\/CharacterAi_NSFW\/comments\/120cr5k\/why_does_character_ai_block_messages_relating_to\/?force_seo=1\">expressed frustration<\/a> with the new regulations, and the potential for tragedy still looms, including <a href=\"https:\/\/www.vice.com\/en\/article\/pkadgm\/man-dies-by-suicide-after-talking-with-ai-chatbot-widow-says\">suicide<\/a>. All in all, the allure of AI relationships lies in their ability to offer emotional support without the messy complications of human relationships. This echoes the moment when Theo\u2019s ex-wife, Catherine, tells him that he has always wanted a marriage without its challenges. <\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Her (2013) - Joaquin Phoenix &amp; Rooney Mara Scene\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/Ewq5tStHmdk?start=266&#038;feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p>And truthfully speaking, some users, many of them men, have come to prefer their virtual relationships to dating real women. 
This also reminds me of one of my favorite scenes from <em>Blade Runner 2049<\/em>, when Joe gets told he doesn\u2019t like real girls. <\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"You Don&#039;t Like Real Girls | Blade Runner 2049 [Open Matte]\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/g6u33j_T5VQ?start=98&#038;feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p>Ultimately, much like Weizenbaum\u2019s ELIZA effect, users risk becoming emotionally dependent on AI entities that lack the capacity for genuine empathy and human understanding. Moreover, the commodification of human relationships and AI companies\u2019 dubious quick fixes for people\u2019s loneliness raise concerns about the exploitation of vulnerable people for profit. <em>Her<\/em> offers a heartfelt story about these complexities, but we must remember that the reality of AI relationships is far more nuanced and fraught with risk.&nbsp;<\/p>\n\n\n\n<p>In the meantime, I\u2019d like to share my AI media of the week! Kanye West dropped a fully AI-generated music video to complement the release of his new album <em>VULTURES 1<\/em> with Ty Dolla $ign.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"\u00a5$, Ye, Ty Dolla $ign - Vultures (Havoc Version) feat. 
Bump J &amp; Lil Durk\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/GQEcxrY0CWA?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>Humans cannot help but develop emotional attachments for the things they own and it certainly does not help that we anthropomorphize everything\u2014that includes AI chatbots. Advancements in the artificial intelligence field are occurring faster than you can say knife and allowing chatbots to simulate emotional connections, and even enter marriages with us. Although these AI [&hellip;]<\/p>\n","protected":false},"author":8591,"featured_media":146,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[9,13],"tags":[49,50,51],"class_list":["post-143","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-searcher","category-week-5","tag-character-ai","tag-chatbots","tag-her"],"_links":{"self":[{"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/posts\/143","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/users\/8591"}],"replies":[{"embeddable":true,"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/comments?post=143"}],"version-history":[{"count":6,"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/posts\/143\/revisions"}],"predecessor-version":[{"id":157,"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/posts\/143\/revisi
ons\/157"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/media\/146"}],"wp:attachment":[{"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/media?parent=143"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/categories?post=143"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/scholarblogs.emory.edu\/aiandfilm\/wp-json\/wp\/v2\/tags?post=143"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}