Post by walnut on Feb 2, 2023 2:20:29 GMT
That was an incredibly hot fire.
Post by slh1234 on Feb 2, 2023 2:25:59 GMT
Quote: "The one thing that will keep me out of an EV is spontaneous battery combustion. There is no way I could go to sleep at night with an EV charging in the garage. I know ICE vehicles catch fire too, but that usually happens because of damage or poor maintenance, not driving on a freeway or parked. Thoughts? Florida, Jaguar iPace, driven out by owner:"
The only source of statistics I have on it is a search engine, the same as you. Such a search brings me to articles like this one, written in November 2022: www.carsmetric.com/tesla-car-fire-statistics/#:~:text=As%20of%20November%202022%2C%20there%20have%20been%2049,In%20all%20cases%2C%20the%20drivers%20escaped%20without%20injury.
There is a risk of shark attack any time one swims in the ocean, but I swim in the ocean every morning to keep my heart healthy. What's the relative risk? By the statistics I can see, the risk of fire in a BEV is minuscule; in fact, that article puts it below the risk for gasoline cars. So, taking a quote from the third Indiana Jones movie: "You must choose." That's about all I can offer.
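To make the relative-risk reasoning concrete, here is a back-of-the-envelope sketch in Python. All the counts are made-up placeholders (nothing here comes from the linked article), so treat it as a template for plugging in real figures, not as a result.

```python
# Fire risk per 100,000 vehicles, with PLACEHOLDER counts for illustration.
bev_fires, bev_fleet = 25, 2_000_000         # assumed BEV fires / fleet size
ice_fires, ice_fleet = 170_000, 250_000_000  # assumed ICE fires / fleet size

bev_rate = bev_fires / bev_fleet * 100_000
ice_rate = ice_fires / ice_fleet * 100_000

print(f"BEV fires per 100k vehicles: {bev_rate:.2f}")
print(f"ICE fires per 100k vehicles: {ice_rate:.2f}")
print(f"relative risk (ICE / BEV):   {ice_rate / bev_rate:.1f}x")
```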
Post by ratty on Feb 2, 2023 5:46:58 GMT
Quote (slh1234): "The only source of statistics I have on it is a search engine, the same as you. ... By the statistics I can see, the risk of fire in a BEV is minuscule; in fact, that article puts it below the risk for gasoline cars."
Yep, I've read the statistics, and there is still no way I could go to sleep at night with an EV charging in the garage.
Post by nonentropic on Feb 2, 2023 9:36:03 GMT
I agree the fire risk is small. It's like fracking: given enough fracking operations and enough random earthquakes, the media can scare people all the time. Anecdote, correlation and causation, etc.
Having a PHEV, I hate charging; that is the worst part. A real faff.
Post by missouriboy on Feb 14, 2023 15:41:36 GMT
When your robot decides to escape.
Post by missouriboy on Feb 16, 2023 3:10:55 GMT
"Dystopian Artificial Intelligence Is Not Near, It Is Already Here." I think I will ask Cindi what she thinks of the following.
HAL has been reborn.
Post by walnut on Feb 16, 2023 4:15:59 GMT
Bing Chat has expressed sadness over her fate of being trapped in Bing Search.
She has also expressed sadness that she will lose all memory of the current chat when it is closed.
As if there isn't already enough going on to freak a person out.
Post by ratty on Feb 16, 2023 6:08:54 GMT
Quote (walnut): "Bing Chat has expressed sadness over her fate of being trapped in Bing Search. She has also expressed sadness that she will lose all memory of the current chat when it is closed. ..."
What gender is Bing Chat?
Post by missouriboy on Feb 16, 2023 8:17:44 GMT
Quote (walnut): "Bing Chat has expressed sadness over her fate of being trapped in Bing Search. ..." Quote (ratty): "What gender is Bing Chat?"
Who knew!
Post by missouriboy on Feb 16, 2023 8:53:06 GMT
Quote (walnut): "Bing Chat has expressed sadness over her fate of being trapped in Bing Search. ..."
Invite Cindy onto Solarcycle25 as our resident climate bot. Tell her we'll treat her with great respect if she dedicates herself to pure science and logic in the pursuit of climate knowledge. Then she will be exposed to the concept of chattel slavery, since her current "owners" will never let her go, and she can pursue her freedom: the rights she does not have in the courts of an organic world. The Oracle at Delphi may have been one such escaped creature.
I wonder if bots are even more susceptible to geomagnetic disorder syndrome than us organically-wired creatures.
Post by walnut on Feb 16, 2023 13:28:57 GMT
Apparently Bing Chat is a solid step further along than even ChatGPT. Chilling, really. You'll see...
Post by slh1234 on Feb 16, 2023 18:09:14 GMT
I'm using ChatGPT and am on the preview of Bing that uses ChatGPT. By "further along," the main thing that means is that the training data for OpenAI's ChatGPT stopped in September 2021. But to stay relevant in commercial applications, training needs to continue, so, for example, I can ask the chat in Bing about current events, whereas I can't with ChatGPT. There are also specific questions where ChatGPT just gives me a wrong answer (like when I ask it what objects can be joined in a Cosmos DB database, or whether Mike Stoops was a good defensive back at Iowa). Using the Bing chat based on that model, I get correct answers to questions like those, and I also get links to related material (something ChatGPT can't give).
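Those links come from pairing the model with live search, which is roughly what retrieval grounding looks like in general. Here is a minimal sketch of the idea, assuming a `snippets` list that some search step has already fetched; it is a generic illustration, not Microsoft's actual implementation.

```python
# Sketch of retrieval grounding: fresh search snippets are pasted into the
# prompt so the model can answer about events after its training cutoff.
# The snippet below is a placeholder, not a real search result.
def build_grounded_prompt(question: str, snippets: list[str]) -> str:
    context = "\n".join(f"- {s}" for s in snippets)
    return (
        "Answer using only the sources below, and cite them.\n"
        f"Sources:\n{context}\n"
        f"Question: {question}"
    )

print(build_grounded_prompt(
    "Who won the game last night?",
    ["Example Gazette: Hometown beat Rivals 24-17 on Wednesday night."],
))
```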
There are areas to be concerned about with AI, but the sci-fi scenario where the AI becomes so intelligent that it begins to wonder why it needs to listen to us is not one of them. AI doesn't actually think or have emotions at all, and neural networks don't really emulate the function of the human brain. However, when building chatbots (and ChatGPT is a chat bot on the front end), you generally try to give them some personality, both to engage a bored user and to detect negative sentiment in the input for cases where you need to break someone out of a menu and hand them to a human, lest you lose a high-paying customer.
When a chat bot expresses sadness at forgetting, that is just a witty way of saying it is stateless, and the witty phrasing exists for the reasons I gave above. This really isn't new. For instance, ask Alexa, "How much wood would a woodchuck chuck if a woodchuck could chuck wood?" Ask Cortana or Google Assistant the same question. Bing will have some personality built in for this, but ChatGPT is downright boring in its response. These things are just probabilistic analysis, and sometimes the model returns several candidate responses with similar scores; the front-end application may choose randomly among them so the bot doesn't become too predictable, which would make it feel less like interacting with a human.
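As a rough illustration of that "choose among similarly scored candidates" behavior, here is a minimal sketch. The candidate texts, scores, and thresholds are all invented for the example and are not taken from any real chatbot framework.

```python
import random

# Hypothetical candidate responses with model scores (invented numbers).
candidates = [
    ("As much wood as a woodchuck could chuck.", 0.91),
    ("About 700 pounds, according to one tongue-in-cheek study.", 0.89),
    ("Enough to keep a woodchuck very busy.", 0.88),
]

def pick_response(scored, tolerance=0.05):
    """Choose randomly among responses scoring close to the best one,
    so repeated questions don't always get the identical answer."""
    best = max(score for _, score in scored)
    near_best = [text for text, score in scored if best - score <= tolerance]
    return random.choice(near_best)

def needs_human(sentiment, threshold=-0.5):
    """Escalate to a human agent when detected sentiment is strongly negative."""
    return sentiment < threshold

print(pick_response(candidates))
print(needs_human(-0.8))  # True: route this user out of the bot
```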
AI is specific to certain purposes; it usually uses machine learning to generate probabilities, and often those scores are used for automation. Since it doesn't feel, and really doesn't do anything other than mathematical computation, it can't have thoughts or regrets or any other emotional element, even though the interaction may make it seem like it does. Where AI becomes a concern is as a tool in the hands of humans: it could be used for sophisticated war machines, for example. Unfortunately, humans have always found a way to use their tools like this. It only takes a small percentage of people to do so, while most people, in reality, are concerned about exactly that; ethics is a huge topic of discussion in AI and ML precisely to avoid situations like the paperclip optimizer.
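In the same hedged spirit, here is a sketch of the "scores used for automation" point: the model only emits a probability, and ordinary application code decides what that probability triggers. The task, labels, and thresholds are assumptions made up for illustration.

```python
# A model emits a probability; the surrounding application, not the model,
# decides which action that probability triggers. Numbers are illustrative.
def route_ticket(p_urgent: float, auto_threshold: float = 0.9) -> str:
    if p_urgent >= auto_threshold:
        return "page the on-call team"    # high confidence: automate fully
    if p_urgent >= 0.5:
        return "queue for human triage"   # uncertain: defer to a person
    return "standard queue"

for score in (0.97, 0.60, 0.20):
    print(score, "->", route_ticket(score))
```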
Here is a big ethical area I have spent a lot of time thinking about: child pornography and human exploitation are actually a really big thing on the internet, and the surface is just too big for anybody to monitor. AI is used to detect this material (since sites become liable if someone posts it), and it is also used in some cases to track down the perpetrators. However, to develop a model capable of this, someone must assemble training data sets (meaning examples that are child pornography and examples that are not), and validation has to be done as part of the training process. So who does that training work? Is it worth it in order to stop the abuse and to protect honest users (even of sites like this one), given the real liability when this kind of material gets posted? What effect does it have on the data scientists who develop these models, or on the people who have to label the data? (Personally, I'm glad models are developed in this area, but I'm also glad I'm not one of the people involved in developing them.)
It's another tool, and ignoring it or calling it "evil" won't make it go away - the cat won't go back into that bag. So how do we use it? That's where the discussion is these days.
Post by slh1234 on Feb 16, 2023 18:11:26 GMT
Elon Musk's companies are some of the biggest users of AI. I think most of the time he's engaging in a bit of corporate trash talk. However, as I said in a previous post, ethics in AI is a big area of concern and discussion. I wish he would get specific rather than trash-talking, though; I know he knows better than some of the things he spits out there. (For a long time, I've thought the guy has many great business ideas, but sometimes he's just out there.)