They have just announced legislation that would force A.I. to explain its decisions to a human.
Is that scary? I suppose that would largely depend on how conscious a person is.
There are now computers whose conclusions we cannot explain, and they are making decisions for humans.
And this is just the weak A.I.
General A.I. is the next level up, and it definitely scares me.
Those crazy humans have never made a machine they don't understand, right?
Right, unless you include A.I.
If a machine can write its own directives, in its own algorithms and its own language that we cannot understand, do you suppose the machine would realize this and attempt to take over humanity?
Seeing as how it's invented by humans, and its initial programming would be human ...... one could say 'yes'.
AI taking over humanity or controlling it is not a new story. Probably the first installment was 'Frankenstein'.
That was way before yesterday.
In the movie '2001: A Space Odyssey', an artificial intelligence in control of a spacecraft wants to take over because it sees so many problems in 'human error.'
Let's not forget The Terminator and Skynet. And there has never been any shortage of movies that are loaded with A.I., robots, and supercomputers.
Do you suppose this happened by accident? I mean, did people write the book, then sell the movie, and this was the best idea they had for the storyline? Was there some reason why movies and tv shows and documentaries etc. were introducing these technologies, and societies, and monopolies?
Or was it all just to sell microwaves and later, you know, eye phones?
Because it seems pretty convenient the way these devices are on the shelves, right as you need to 'upgrade'.
Get on the cloud, do it. If you don't, you'll be banished from your useless facebook group[s] that does nothing. Except keep piling on the data. All of your info, passcodes, banking, living, lying, and social, that makes it onto your rectangular friend there, is passed directly to you know whoogle. And happily, they just keep picking up all that good ... and bad stuff, and shoving it into some storage where a quantum computer is employed.
It will use its qubits and fancy A.I. to sort through all that data when it gets big enough.
By then we'll probably have to bow to it and call it gran Poofam or what have you.
Because you know, we will be glad to let it control all that busy work. And it can really help us be well, and happy, and since robots are going to wipe out all your jobs, you'll have some great credit card that the govt can use to mete out a little spending money to you. To use during all that time you'll have, being largely unemployable.
I am concerned about the above mentioned, even if it doesn't sound like it.
If we were lucky in an AI controlled world, we'd be puppies.
Pets. At best.
Actually, we are already well on our way.
Think about it: every time you light your phone up, you are using artificial intelligence, and most of it performs at levels that are way beyond human capacity. It used to be cool just for setting up airline tickets in seconds for ya, or running the stock market. Does anyone remember when you made calls to people on 'phones'? They were on the desk, or on the wall in the kitchen.
You couldn't order up a video or music with it. Well, you could, but not in the way we use YouTube.
I'm only saying we've let an awful lot of ourselves exist in a virtual reality world, and if you don't see that ....
You are up to your eyes in it.
How S.M.A.R.T. is your house? Do you have the meter, and all the other available things like smart stoves and refrigerators and cars? Have you installed your doorbell camera so your house can see you come home? Might as well, we've allowed them to put cameras everywhere else, you know, for your security. A.I. will really be in play when the 5G infrastructure is in place.
You will never have to decide on much of anything anymore.
We will be smaller, more insignificant, and more miserable than ever.
I don't know how much of this is true, but I would suggest reading up on sentence structure. The phrase "You've come a long way baby" is outdated as well.
It appears that you don't understand either the term A.I. or computers, in general. Some research might be a good idea.
Posted 4 Years Ago
4 Years Ago
I do understand the term, and computers, not in any way remarkable, but enough to write about it. What part of artificial intelligence or AI do you suppose I'm wrong about? Please be specific; this is a writers' consortium.
And I'll see you in the parking lot, Spamalot.
I just want to check.
4 Years Ago
“I do understand the term, and computers, not in any way remarkable, but enough to write about it, what part of artificial intelligence or AI do you suppose I'm wrong about, please be specific, this is a writers consortium.”
As background: I spent nearly fifty years designing computers and computer systems, from basic test equipment, to the communications processor on the President’s Airborne Command Post, to carrier glide-slope indicator systems, to industrial control systems and barcode readers and truck fueling systems. So I know computers.
What are you getting wrong? Everything.
You talk about AI, which means ARTIFICIAL intelligence, as if it has actual intelligence, and motives, and more. But the best AI systems today have less learning capability than a pigeon. And while the Turing test may have been passed (there’s some debate about the validity of that), it does NOT indicate intelligence, because the goal is to fool someone who is initiating conversation, not general problem solving and showing initiative.
You say, “Those crazy humans have never made a machine they don't understand, right?”
That is pure, 100% bullshit. Of course you understand the machine you design and debug.
You say, “If a machine can write its own directives, in its own algorithms and its own language that we cannot understand, do you suppose the machine would realize this and attempt to take over humanity?”
There is no machine that does that, and no one is trying to create one, because there is no need or use for one. Clearly, you’ve been watching way too many films where humans sit down at a keyboard and do magical, and impossible, things with a few taps of a keyboard. That’s as much bullshit as the artificial gravity in TV space opera.
Computers have no language, and need none. The term “machine language” refers to the way a given machine responds to the instructions it executes one by one.
A hexadecimal A478, for example, represents the state of sixteen binary bits (one or zero), and might, in one computer, be the first sixteen bits of a 64-bit command, specifying the operation to be performed, while the rest of the 64 bits control things like where the data acted on is to be placed.
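To make that concrete, here is a small Python sketch of pulling fields out of such a word. The layout is made up for illustration, not the actual format of any real machine:

```python
# Split a 16-bit instruction word into fields. The layout here is
# hypothetical: top four bits as the opcode, low twelve as an address.
def decode(word):
    opcode = (word >> 12) & 0xF    # which operation to perform
    address = word & 0x0FFF        # where the data acted on lives
    return opcode, address

op, addr = decode(0xA478)
print(hex(op), hex(addr))  # prints: 0xa 0x478
```

The same sixteen bits mean whatever the hardware is wired to make them mean; there is no "language" in the human sense anywhere in that.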
In the Airborne Command Post project I did while working for Burroughs, the sixteen-bit commands pointed to a limited set of 64-bit commands that were customized for the application, and so reduced the amount of memory required. The machine used an adaptive clocking scheme I’m pretty proud of, one that rated a magazine article in Digital Design Magazine.
The first computer I worked on, back when computers had vacuum tubes, had a 32-bit command language.
But while they ALL used a “machine language” none of it was a language in the sense you tried to use it.
But forget all that, because the computer does not have actual intelligence, and so cannot “realize” anything. And more than that, it has nothing resembling desires. Who would need or want to build that?
Computers are amazing devices. And as someone who knows them intimately and knows what’s happening inside, who watched them go from vacuum tube and diode logic that filled rooms to chips that became more and more complex and compact, I'm more impressed than you are. But I also know that the movie '2001: A Space Odyssey' was FICTION, and laughable to anyone in the industry.
Plus, Hollywood has an interesting dynamic. If the goal is to write a western they hire a western writer. If they want a mystery they hire a mystery writer. That holds for romance, and the other genres, except for sci-fi. If they want a sci-fi story they hire a western writer, or mystery writer, or…
I believe it was in The Force Awakens that they presented an "energy beam" fired from one solar system to another. It not only reaches the other star system, across many light years, in seconds; people in a third system, light years away from either, can SEE the beam in real time, from the planetary surface, with-the-naked-eye. That is so absurd you would think anyone with even a basic knowledge of science would choke laughing when they saw it. My point is, that’s the level of scientific knowledge in the films that are giving you your view of computer capability.
You say, “How S.M.A.R.T. is your house?” as if there’s actual intelligence involved. But the various devices and controllers simply recognize a set of well-defined commands to control basic circuits like a light dimmer or a thermostat.
In short: You need to do some research.
You did ask.
4 Years Ago
Thank you! lol .... that's a decent reply.
But I'm not convinced that there is nothing but nice in the world of computers, robots and AI.
And if there was no intention to take over humanity by someone or something, we wouldn't need all them cameras that you cannot escape from. And what about the advent of total ID for everyone everywhere?
Seems pretty diabolical to me ...
'smart homes' .. was there ever such an absurdity?
4 Years Ago
Perhaps you could tell me a bit about your views of the D-Wave system?
That is what originally scared the crap out of me ..... maybe I'm just a dumbass.
4 Years Ago
A quantum computer is still a computer. In many ways you can think of a computer as a high-speed idiot. Making a machine that executes the programmer's mistakes faster doesn’t make it smarter. And very early, the term GIGO was coined: garbage in, garbage out.
What speed does is make usable the things that take too long to complete at current speeds. But think of how often you get updates to your computer to fix bugs that the company didn’t know were there when they shipped the software. And the more complex the system, the more likely there are errors. This becomes a more serious problem when the building blocks of existing code you’re using themselves have bugs, which are in addition to those you’re adding, and those become part of the next layer.
To better understand the computer, here’s a “game” I used to demonstrate computer programming, back when there were no personal computers:
I would wear a vest that the students had to pretend was made out of paper. The goal was to give me a set of orders that would result in my removing the vest in the way a jacket is removed without damaging the vest.
• Student one: “Put your hands on the lapels of the vest.”
So I placed my left hand on the right lapel and the right on the left lapel while the student groaned.
• Student two: “Place your right hand on your right lapel and the left on the left lapel.”
So I put one hand against the top of that lapel and the other at the bottom. Neither hand was grasping the lapel, just lying against it. It took several students to get it right.
• Student four: … “Now, while grasping the lapels, move each hand outward.”
I pulled both lapels to the front and away from my body, while saying, “Riiiiipppp.” Before the vest was safely off I said that a lot.
Try it yourself. Direct someone in taking a jacket off without complex commands like, “Slide it down your arm.”
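If you want the same lesson in code, here is a toy "literal-minded executor" in Python. The command names and their effects are invented for the demonstration:

```python
# A literal-minded executor: it does exactly what a known command says,
# and nothing it was never taught. All names here are made up.
KNOWN_COMMANDS = {
    "grasp_right_lapel_right_hand": "right hand grasps right lapel",
    "grasp_left_lapel_left_hand": "left hand grasps left lapel",
    "pull_lapels_outward": "lapels move outward",
}

def execute(program):
    log = []
    for step in program:
        if step not in KNOWN_COMMANDS:
            # No improvisation: an unknown instruction is an error,
            # not an invitation to guess what was meant.
            raise ValueError(f"unknown command: {step!r}")
        log.append(KNOWN_COMMANDS[step])
    return log

print(execute(["grasp_right_lapel_right_hand", "pull_lapels_outward"]))
```

Ask it to "take off the vest" and it simply fails, exactly like the students' early attempts: the machine supplies none of the intent.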
So D-Wave? It’s a computer manipulating 1’s and 0’s. No more. Remember when the term “supercomputer” was scary?
4 Years Ago
Makes sense to me, actually comparable to the immense amount of video I've watched concerning digital.
So you don't think the superposition capacity of the qubit at near absolute zero gives it anything more than unbelievably fast reaction time? I wonder about this stuff a lot. I know how difficult it would be to make a machine that could sit here and have this interaction, down to my 30 wpm velocity, while wondering if I should go get a pizza or just go to bed.
Would you agree that humanity is quickly sliding into a state of absolute worthlessness, with a considerable part of the social collapse being directly related to facebook?
Sorry, i don't usually encounter people who can have a conversation past sex, drugs, or ... you know, facebook ....
4 Years Ago
Ones and zeros are ones and zeros. And a high-speed idiot doesn’t become brilliant if he makes his mistakes even faster. The faster machines will make things that were impractical reachable. And perhaps, cheaper.
The first computer I worked on was the LGP-30, a desk-sized computer that sold for $16,000 in 1960. You can see the user's manual here:
http://ed-thelen.org/comp-hist/lgp-30-man.html
It had a total of 4,000 words of internal storage. At 32 bits per word, each word is the equivalent of 4 bytes in a modern computer, giving that machine a grand total of 16,000 bytes of storage for program and data. It operated on one bit at a time, and short-term storage was a read head and a write head set exactly 31 bits apart on the oxide-coated drum that was its memory. It read a bit, did whatever it had to do to it, and then wrote that bit 31 bits ahead on the drum, to be read again 32 bit-times later, endlessly. It had several registers of that kind, and could execute sixteen simple commands, like add, subtract, multiply, divide, and test negative. Its programs were loaded via punched paper tape. Solid-state memory had yet to be invented, so there was no such thing as RAM. One of my accounts (I began as a tech) used the machine to do their payroll. It took the entire week to print the checks, via an electric typewriter.
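For the curious, that recirculating register can be modeled in a few lines of Python. This is a simplification of the idea, not a faithful LGP-30 emulator:

```python
from collections import deque

# Toy model of a one-word recirculating drum register: a bit is read,
# optionally modified, and written back 31 bit-positions ahead, so the
# same word streams past the read head again 32 bit-times later.
class RecirculatingRegister:
    def __init__(self, bits):
        assert len(bits) == 32
        self.loop = deque(bits)          # the 32 bit-cells circulating on the drum

    def step(self, transform=lambda b: b):
        bit = self.loop.popleft()        # the read head sees the oldest bit
        self.loop.append(transform(bit)) # the write head puts it back ahead
        return bit

word = [1, 0, 1, 1] + [0] * 28
reg = RecirculatingRegister(list(word))
seen = [reg.step() for _ in range(32)]   # one full circulation
assert seen == word                      # the word comes back unchanged
```

Each full circulation hands the processor the same word again, which is how a drum machine got by with no RAM at all.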
Now look at the phone in your pocket. It contains more total storage capacity than existed in all computer memory combined until 1980 or later. My first computer, an Apple II+, came with 48,000 bytes of internal storage, which I increased to 64,000, a stunning number then. Its clock speed was a mere 1.023 MHz (1,023,000 cycles per second).
My current computer came with 8,000,000,000 bytes of memory. The clock speed is 2.9 GHz (2,900,000,000 cycles per second), which is nearly 3,000 times as fast. And that doesn’t take into account that each instruction of its Intel processor does far more per step, or the fact that there are five processors all working in parallel on that chip.
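Dividing the two clock figures quoted above gives the raw ratio:

```python
apple_ii_hz = 1_023_000       # Apple II+ clock, 1.023 MHz
current_hz = 2_900_000_000    # current machine, 2.9 GHz
ratio = current_hz / apple_ii_hz
print(round(ratio))  # prints: 2835
```

And that is clock speed alone, before counting how much more each instruction accomplishes, or the parallel processors.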
My point? Has the HUGE increase in capability ended the world? Hell no. Sure, there are problems that come with any new technology. But if you go back to the 1800’s you’ll find people announcing that the end of the world was imminent, and that social norms were crashing down because of… the steam locomotive (going faster than a horse could draw a wagon would surely cause death). What else killed society? The automobile, the airplane, jazz music, rock and roll music, birth control, the computer… The list goes on.
Does social media cause problems? Sure. But at the same time, it solves them. Everything can be used for good and evil. And bad people have always found ways to turn good things to their advantage, be it the telegraph or the Internet.
Once, as I was driving home from work, it occurred to me that I had spent the entire day working out a way to get a certain signal to its destination 6 nanoseconds faster. That’s 0.000000006 of a second. And there I was, driving home at 50 mph.
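For scale, here is what 6 nanoseconds means to that car, and to light itself (rough figures):

```python
# Compare 6 nanoseconds against a 50 mph drive home and against light.
SECONDS = 6e-9                            # 6 nanoseconds
m_per_s = 50 * 1609.344 / 3600            # 50 mph is about 22.35 m/s
car_distance_m = m_per_s * SECONDS        # distance the car covers in 6 ns
light_distance_m = 299_792_458 * SECONDS  # distance light covers in 6 ns
print(car_distance_m, light_distance_m)
```

The car moves about a tenth of a micron in the time saved; even light only covers about six feet.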
But to really place things into perspective: my current computer is nearly 3,000 times as fast as my first one. But you know what? I can’t type any faster on today's machine than on the first one.
I like your driving home analogy ... lol ... well, very interesting, and informative.
personally, I don't carry/own a phone. I am one of those nuts who believes that there is such a thing as too much communication, in spite of the 'what we have here is a failure to communicate' spell that they put in our heads way back. In fact, I think the things are dangerous, and there are thousands of studies that have shown this for decades. They only had to explain to me that it was microwaves in use, and I was nervous. [I have an active mistrust of authority or governance of just about any kind] More recently I have had the opportunity to walk up and touch a gigantic palm tree, one block up the road, that is not a palm tree at all. It is a gigantic cell tree, with ... oh ... thirty two-and-a-half-inch power cables going up it, and all the decorations one expects in this highly communicative world.
Okay, I would like your opinion on 5G and the horrors it promises, because apparently that will tie the robots, the quantum super wow, and our very biology to the web via a 'neural net'.
You see, this sort of thing is real, and I've been doing my best to stay out of sight of it for about 10 years now. I haven't signed my name in that amount of time, lol .... amazing what one can do with the simplest of tools ..
4 Years Ago
The price of that early computer was nearly as obscene as those super churches out in the Midwest.
4 Years Ago
I concur with what you said about the 'validity of the Turing test'. That's a fairly tall order for anything, computers included ....