Post by ForeverKuroi on Jun 21, 2022 13:53:25 GMT -5
Yeah but dead people can't sue and the family members of the deceased are typically the ones requesting to chat with them. Now if I'm wanting to chat with DT's great-great grandpa Copytrap, I could see the legal repercussions there. Is the monetary concern from other family?
Post by Mongo the Destroyer on Jun 21, 2022 18:32:58 GMT -5
> Yeah but dead people can't sue and the family members of the deceased are typically the ones requesting to chat with them. Now if I'm wanting to chat with DT's great-great grandpa Copytrap, I could see the legal repercussions there. Is the monetary concern from other family?

Didn't you and I just recently have a discussion on rights for something that was dead? I feel like your opinion was different that time :thinking: The major concerns seem to fall into two areas:

1. Profiting off of somebody who's dead, which is an issue in Hollywood too. Even if the family are the ones requesting it, they aren't making money from it, so it could be seen as predatory behavior for a company to make a product like that, one specifically targeting grieving people.

2. How it's used. The company that wrote the code/algorithm the bot maker used is upset because it isn't supposed to be used in a case like this, where romance is part of it. I.e., they may have used the code to revive this girl as a sex chat bot. In which case the concern comes up that maybe it's disrespectful to use the code to make chatbots based on real people that the customer could whack off to.
Post by ForeverKuroi on Jun 21, 2022 21:14:16 GMT -5
I don't recall the incident you're referencing. Feel free to message me via Discord to refresh my memory.
So #2 is potentially a legitimate concern, especially if the person who died was once a sex worker. However, what if a guy's girlfriend died in a car accident and he wanted to rekindle old memories? In that scenario, it may not be the worst thing in the world, albeit a bit weird. But hey, people have got to grieve.
But I don't see #1 as predatory. This isn't like lottery tickets or cigarettes. You're providing a service that others really want. The profit isn't from death but, rather, from bringing closure. A lot of #1's logic also applies to people who run cemeteries and burial services.
Post by Mongo the Destroyer on Jun 21, 2022 21:28:34 GMT -5
> I don't recall the incident you're referencing. Feel free to message me via Discord to refresh my memory. So #2 is potentially a legitimate concern, especially if the person who died was once a sex worker. However, what if a guy's girlfriend died in a car accident and he wanted to rekindle old memories? In that scenario, it may not be the worst thing in the world, albeit a bit weird. But hey, people have got to grieve. But I don't see #1 as predatory. This isn't like lottery tickets or cigarettes. You're providing a service that others really want. The profit isn't from death but, rather, from bringing closure. A lot of #1's logic also applies to people who run cemeteries and burial services.

You don't think that taking advantage of people who are in a weakened emotional state due to grieving is predatory? If while I was alive I OK'd that sort of process? Fine, maybe for me it's a way to live on. But what if you didn't get consent ahead of time, used the algorithm to make this chat bot, and it's not like the person at all? Maybe their online social life or the other artifacts used in the creation don't paint the right picture, or maybe the programmers have introduced too much of themselves into the program? Now you're misrepresenting them and, again, "reviving" them without consent. I feel like you're putting a lot of weight on "Well, the person grieving really wants it" but not any on "Maybe the person being used wouldn't have wanted that."
Post by ForeverKuroi on Jun 21, 2022 21:57:17 GMT -5
> > I don't recall the incident you're referencing. Feel free to message me via Discord to refresh my memory. So #2 is potentially a legitimate concern, especially if the person who died was once a sex worker. However, what if a guy's girlfriend died in a car accident and he wanted to rekindle old memories? In that scenario, it may not be the worst thing in the world, albeit a bit weird. But hey, people have got to grieve. But I don't see #1 as predatory. This isn't like lottery tickets or cigarettes. You're providing a service that others really want. The profit isn't from death but, rather, from bringing closure. A lot of #1's logic also applies to people who run cemeteries and burial services.
>
> You don't think that taking advantage of people who are in a weakened emotional state due to grieving is predatory?

No more predatory than people selling the biggest, boldest tombstone out there to incentivize people to spend thousands on a stone. At least with the bot, you can interact with it more often. My mom died two years ago and I've visited her grave once.

It's easy to be predatory during an emotional time. Have you seen how many people spend tens of thousands of dollars on weddings? Or hundreds of bucks on birthdays? I don't think providing a service is inherently predatory. Now, it certainly COULD be predatory. If I programmed this bot to reflect the worst of your relationship 100% of the time and to upcharge you for good memories? Or even made it subscription based, where I charge you by the month? The first is absolutely predatory, and the second quite possibly.

I think it'd be easier with consent before death. I would give mine. If I die tomorrow, I'd want Mrs. Kuroi to have more than stories, videos and pictures to remember me by.

Again, getting it wrong is a legitimate concern, and the only real remedy for that is to make sure the algorithm is up to par. Another concern is how people present themselves online: I'm more political on Facebook than I am in real life. But a huge part of science is acknowledging a level of error, and some errors are more acceptable than others. I'd rather my vacuum fail on me than my life vest.
Post by ForeverKuroi on Jun 21, 2022 22:14:27 GMT -5
Out of curiosity, how would your opinion change if this was a free service, potentially running off donations instead?
Post by Mongo the Destroyer on Jun 21, 2022 23:00:59 GMT -5
> Out of curiosity, how would your opinion change if this was a free service, potentially running off donations instead?

Also bad, lol. I think consent is needed from the subject.
Post by Mongo the Destroyer on Sept 21, 2022 19:25:43 GMT -5
Post by ForeverKuroi on Sept 22, 2022 16:52:48 GMT -5
I won't let it go to my head. I think we're tricked into thinking AI is better than it really is because of:

A) Hollywood.
B) The fact that most people (including me, to a lesser degree) don't understand it.
C) The results that are really good surprise us with how good they are.

To be quite honest, in everyday usage, we have the world's smartest people collaborating to create today's best results, and Alexa still can't do a lot of things right, and my car's GPS is still really slow and often crap.
Post by jacobcraft on Mar 18, 2023 8:07:04 GMT -5
Not to bring up an old topic, but there are drones being used for war. I forget the country, but the drones decide who and when to attack all by themselves.
Post by Mongo the Destroyer on Mar 18, 2023 8:33:05 GMT -5
> Not to bring up an old topic, but there are drones being used for war. I forget the country, but the drones decide who and when to attack all by themselves.

It seems quite reckless for a country to empower a drone like that.
Post by edwarddubin0604 on Mar 18, 2023 20:23:17 GMT -5
Remember HAL from the movie "2001: A Space Odyssey," the computer that Dave had to shut down after HAL became a killer. Then there was "WarGames," where Matthew Broderick's character is asked by a computer, "Shall we play a game?" It turns out he almost started a war.

In an old Star Trek episode called "The Ultimate Computer," the creator of a computer program built for self-preservation, which the Federation ordered installed for test purposes, had to force the program to shut down after it destroyed a transport vessel and attacked Federation starships during a test session. The Federation then targeted the Enterprise because of that, and it took Captain Kirk to call off the attack. Then there were the Borg, ruled by the Borg Queen, an A.I. who could be replaced when destroyed by either a computer virus or the destruction of the Borg Cube. Steven Spielberg also made a movie called "A.I.: Artificial Intelligence," which was a take on Pinocchio, since the main character thought he was a real boy.

When it comes to drones in war, they are programmed to target enemy positions and destroy those targets. Then there are drones that can deliver packages to people's doors. The thing is, AI still needs humans to repair and upgrade it, check it for possible viruses, and prevent hackers from breaking into its systems.
Post by jacobcraft on Mar 18, 2023 23:09:43 GMT -5
These don't. They decide all on their own. I'd love to know the programming.
Post by edwarddubin0604 on Mar 21, 2023 18:45:17 GMT -5
So would hackers. Did you see Quantum Leap this past Monday? Near the end of the episode, Al's daughter revealed that the show's Ziggy program may have been the one that sabotaged the Quantum Leap project, sending the character of Dr. Ben Song through time and inhabiting different bodies. Ziggy may have done the same with Dr. Sam Beckett in the original Quantum Leap series, just to sabotage the program.