Steve Wozniak: Human Over Technology: Page 2 of 5

Maintaining a creed of human over technology has served Apple and its co-founder Steve Wozniak well over the last four decades. In this Q&A, Wozniak discusses how making technology work for people brought about the Apple II and shares his thoughts on artificial intelligence.

and the mechanical engineering aspects, properties of materials, and electronics and programming. It’s hard to say, but I think I would have been on a course toward robotics.

In this modern age, if I were bright enough, I’d probably be headed toward artificial intelligence to make [a robot] do a lot of useful things on its own, like sit in the driveway and wash my car on its own, one square centimeter at a time, all night long, while I sleep.


DN: You just mentioned DIY (do-it-yourself) tech. How do you feel about the open-source movement?

Wozniak: I would have to guess that I would still be very into not only civil liberties but sharing information to build on what others have done. I think I’d be very much into the open-source world. Absolutely, I mean, do what you do. I was young then but [if it were today], I think I’d still like to show engineering prowess and encourage other people to follow suit and be able to build on what you’ve done.


Steve Wozniak will take the stage on June 13, in New York, during Atlantic Design & Manufacturing, the East Coast's largest advanced design and manufacturing event. The engineer and cult icon will discuss a range of topics that span his experience at Apple, as well as today's leading tech trends such as robotics, IoT, and wearables, among others.

DN: You also mentioned AI. Do you have any ethical concerns about the way artificial intelligence is developing now?

Wozniak: Ethical, to me, means truthfulness. The fact that you might be building something that changes mankind – it’s hard to say if it’s a detriment to mankind; we may end up being second in the order of species to machines – I don’t think that ethics applies. We don’t go into these things thinking, ‘oh, I’m going to do something very bad so it gives me power’ or something like that.

You can’t really stop progress. Learning, science, being able to make things that never existed before—you can never stop that. Those things can turn out to have bad aspects. Study the atom and you get the atomic bomb. Learn how to build machines that can make clothing and you could have a lot of people out of work who have to do other things.

There’s sort of a fear with artificial intelligence that machines could become so intelligent and versatile that they could totally replace a person, so there wouldn’t be other jobs to go to. But that is so far off it’s an unrealistic fear at this stage. It would take decades and decades.

We have machines that can learn to play a game faster and better than a human. For 200 years, we’ve had those machines that can make clothing better than a human. It seems like they are thinking better and faster than us, but we told them what to think about, what to work on, what to learn and the method to learn it by—and then
