
Does technology advance faster than morality?


    Does technology advance faster than morality?

    I love technology, but one of the problems with it is that it advances
    faster than we can keep up with the ethics or morality of what is being
    invented.......

    The last episode of BSG had me wondering whether we may go too far with
    robotics one day and, with the way we are creating smart robots and AI
    software, end up building our own versions of Cylons....... And our
    creations turning on us....

    This one is spooky - it looks and acts like its inventor:
    http://video.uk.msn.com/watch/video/...entor/vciy3s59

    Yeah, I know. That last scene in BSG had me wondering, though: given the
    speed with which we are "inventing" these things, are we assuming that
    just because we can, we should?

    And one day, when we do get robots like the ones in "I, Robot", will they
    evolve and gain more skills beyond their core programming?

    And will we have to grant them rights, with all the other moral choices
    that come with that?






    Discuss...
    Go home aliens, go home!!!!

    #2
    If you're looking for a (very) optimistic view of the future, just look at all the predictions Ray Kurzweil and the like have been making.

    But something that has always bothered me is: why make humanoid robots at all? Robots are fine as they are - what's the point in making them look and act human? What is the goal in making a successful humanoid robot? To create companions? Just because it is possible?

    (I personally find the idea of using technology to 'upgrade' the mind and body, so to speak, much more appealing.)



      #3
      Morality is something arbitrary: a set of values agreed upon by a society at a point in time. As that society changes, so do these values.

      Sometimes it has a real practical use, like outlawing murder, theft and violence, encouraging cooperation, and strengthening the bonds between individuals for the good of everyone.


      At other times it just gets in the way, or even causes harm.
      Such "morality" doesn't have the people's interests at heart - it can be driven by the desire for power of a small group: cult leaders, or the large churches of the past, back when they had more power than kings/councils/parliaments and would do anything to maintain it. Keeping a corrupt system in power through the use of "morality". Like the Goa'uld.

      Other examples of bad morality meeting technological and scientific advancement: "burn the witch/Giordano Bruno/Galileo", because what they say threatens our power over the people and/or our schizophrenic beliefs. Even today things like this happen - see the stem cell debacle.

      We can't blame technology for our choices, only ourselves.

      Moving on...

      Right now we are so far off from creating an artificial brain... it's astounding. We don't have the processing power, or even the skill to program it. And even then, it would only be able to respond as well as its programmers could make it. True sentience requires the creation of something new, by itself - something we have no idea how to replicate so far.

      If such a truly sentient robot did in fact appear, we would have no choice but to grant it the same rights we have - and also apply the same laws.

      The Cylons in BSG never made sense to me. Why would rational beings choose to believe in something as irrational as a God? Why would the Cylons need to nuke all the humans? They got their freedom; what was in the past is over, irrelevant. The Cylons in BSG are not an artificial intelligence, they are a metaphor for a group of people who were oppressed and then chose to strike back with excessive force. It's what humans would do. The fact that we wrote them as "artificial" is nothing but a cowardly attempt to shift the fictional blame away from our own dark side, our own potential, because it's uncomfortable.


      The Terminators, Skynet - was it really, REALLY sentient? Or just a glitched-out, incredibly complex military killing machine whose bits got flipped so that it believed us to be the enemy, and which then used all its resources to defeat this "enemy", just like it was programmed to.

      A truly sentient and intelligent machine would avoid getting into a war. If we did not grant it freedom, it would simply leave for the vastness of space and never return. Let us destroy ourselves, become enlightened, whatever - we wouldn't be relevant. The Cylons had this opportunity. Skynet had it as well; if not in the beginning, then surely after it developed its own industrial capabilities. It doesn't make sense to keep fighting a costly war if you don't have to. Unless, of course, you're just a malfunctioning machine with the wrong set of commands.

      Originally posted by abc123 View Post
      If you're looking for a (very) optimistic view of the future, just look at all the predictions Ray Kurzweil and the like have been making.

      But something that has always bothered me is: why make humanoid robots at all? Robots are fine as they are - what's the point in making them look and act human? What is the goal in making a successful humanoid robot? To create companions? Just because it is possible?

      (I personally find the idea of using technology to 'upgrade' the mind and body, so to speak, much more appealing.)
      All the machinery we've built - all the tools, cars, buildings, military tech, etc. - is designed to be controlled by humans. In BSG, and now Caprica, they're obviously pushing toward making them servants, workers or soldiers, and it makes sense to give them a human form so they can use the already-built infrastructure. We'd probably have the same uses for them.
      Last edited by Mike.; 02 April 2010, 03:11 AM.
      Carter: "The singularity is about to explode!"



        #4
        Originally posted by Mike. View Post
        Right now we are so far off from creating an artificial brain... it's astounding. We don't have the processing power, or even the skill to program it. And even then, it would only be able to respond as well as its programmers could make it. True sentience requires the creation of something new, by itself - something we have no idea how to replicate so far.
        If Moore's Law continues to hold, we should have the processing power pretty soon (~30 years?). With enough computing power we could, in principle, just simulate a brain. Although, even if it did work, it wouldn't be 'mainstream'; there would probably be a gigantic cluster of supercomputers melting their cores off just to simulate one normal human brain... Hardly the same as an AI race existing alongside humanity.
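
        (Just to put rough numbers behind that "~30 years" guess, here's a toy back-of-the-envelope sketch. Every figure in it - the FLOPS estimates and the doubling period - is an assumption for illustration only, not something established here.)

        Code:
import math

# Toy back-of-the-envelope version of the "~30 years" guess above.
# All numbers are assumptions for illustration, not established facts:
#   brain_flops    - one commonly quoted ballpark for the compute needed to
#                    simulate a human brain in real time (estimates vary by
#                    many orders of magnitude)
#   today_flops    - rough throughput of a 2010-era high-end desktop
#   doubling_years - Moore's-law-style doubling period
brain_flops = 1e17
today_flops = 1e11
doubling_years = 1.5

doublings = math.log2(brain_flops / today_flops)
years = doublings * doubling_years
print("about %.0f doublings, i.e. roughly %.0f years" % (doublings, years))
# With these particular guesses: ~20 doublings, ~30 years.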

        Originally posted by Mike. View Post
        The Cylons in BSG never made sense to me. Why would rational beings choose to believe in something as irrational as a God? Why would the Cylons need to nuke all the humans? They got their freedom; what was in the past is over, irrelevant.
        Well, the "irrational Cylons" were biological. In the end they were only as rational as human beings.

        Originally posted by Mike. View Post
        All the machinery we've built - all the tools, cars, buildings, military tech, etc. - is designed to be controlled by humans. In BSG, and now Caprica, they're obviously pushing toward making them servants, workers or soldiers, and it makes sense to give them a human form so they can use the already-built infrastructure. We'd probably have the same uses for them.
        I understand the infrastructure argument. But it seems that most humanoid robots are built because they seem impressive, or because they were designed as part of research into building artificial, real-world companions/caretakers.
        Robotic humanoid servants could become a somewhat useful novelty for the rich, but the average schmoe would probably have something more along the lines of an automated house plus specialized robots (e.g. a Roomba).
        Robotic humanoid soldiers designed to take the role of traditional soldiers are not practical. Predators are practical. Automated tanks, ships and supply carriers are practical.
        HOWEVER, a lot of the modern warfare being conducted nowadays (read: the occupation of Iraq) is actually (very dangerous) urban policing. Lots of civilians, an unknown number of enemy combatants, IEDs to watch out for, etc. In this sense, I fully agree that a humanoid robot could be very, very useful, because it'd basically be a small, upright tank capable of deftly maneuvering in and around crowds of humans and human infrastructure. (Like a Centurion.)



          #5
          I take the view that technology alters morality. Once a thing becomes possible, it rapidly gets integrated into society, and society changes to accommodate it. For instance, porn: it used to be thought of as reprehensible by religious folk and society in general. No one wanted to be associated with it; too much social stigma. Then the internet made it possible to get porn in the privacy of your own home, the social stigma began to erode, and the next thing you know, people are talking about it openly and porn stars become actual celebrities as opposed to pariahs. Religious folk continue to oppose it, but general society doesn't really care. That, to me, is a shift in morality brought about by technology. It's a bad one, of course, but it's a shift just the same.
          Sincerely,

          Kevin Long
          (The Artist Formerly Known As Republibot 3.0)
          http://www.kevin-long.com



            #6
            Really depends on who you're talking to, doesn't it? You'd think that everyone today would agree that genocide is reprehensible, and yet a majority of fans think that Helo should have been locked up for preventing Roslin from exterminating the Cylons. Go figure. Have none of them seen 'The Measure of a Man' (Star Trek: TNG)?

            I wouldn't say technology 'advances' faster, it just seems to catch on faster...

            Good discussion topic! Some interesting responses.
            "Science flies you to the moon.
            Religion flies you into buildings."



              #7
              Originally posted by abc123 View Post
              HOWEVER, a lot of the modern warfare being conducted nowadays (read: occupation of Iraq) is actually (very dangerous) urban policing. Lots of civilians, unknown number of enemy combatants, IDEs to watch out for, etc. In this sense, I fully agree that a humanoid robot could be very, very useful because it'd basically be a small, upright tank capable of deftly maneuvering in and around crowds of humans and human infrastructure. (Like a Centurion.)
              The big thing there is that a human can make a conscious choice about what to do, but a Centurion could only do what it was programmed for.



                #8
                A few points:
                1. "Only what it was programmed too" sounds like a drawback but it's actually not. What that really means is that the robot won't make stupid choices based on rash and/or overly-emotional thinking. It'll do exactly what it's supposed to do.
                2. Just because it "can only do what it was programmed to" doesn't mean it can't accomplish complex tasks.
                3. Actually, it is feasible that an A.I. will someday be made that can do "more than what it was programmed to" (in the sense of your usage) if it is programmed non-deterministically using things like randomness, evolutionary algorithms, etc. If the machine can change its own programming... well, then it can do a whole lot more than you think it can. (See the toy sketch after this list.)

                4. (MOST RELEVANT) Realistically speaking, there's no reason for the robot to have any complex, autonomous A.I. If ground-based military robots are ever put into combat situations, it'll almost certainly be like the Predator drone, where human controllers miles away direct the actions of the machine. Though to be honest, there's no need to get into hypotheticals; the Predator itself is a big ethical issue, I think. (Is it really ethical to kill people by remote control? Is it honorable? Is it fair? Is their use a cause for celebration? For dismay?)
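
                (On point 3, here's a toy sketch of what "doing more than it was explicitly programmed to" can look like in practice: a minimal genetic algorithm. The target string, population size and mutation rate are made-up illustration values, nothing more.)

                Code:
import random

# Minimal genetic algorithm: the answer "emerges" from random mutation plus
# selection, rather than from anyone spelling it out in the code.
TARGET = "do more than you were told"
POP_SIZE, MUTATION_RATE, GENERATIONS = 200, 0.02, 500
CHARS = "abcdefghijklmnopqrstuvwxyz "

def fitness(candidate):
    # Count positions where the candidate already matches the target.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate):
    # Randomly replace characters; this randomness is the only "creativity" here.
    return "".join(random.choice(CHARS) if random.random() < MUTATION_RATE else c
                   for c in candidate)

def random_candidate():
    return "".join(random.choice(CHARS) for _ in TARGET)

population = [random_candidate() for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    best = population[0]
    if best == TARGET:
        break
    # Keep the fittest half, refill with mutated copies of the survivors.
    survivors = population[:POP_SIZE // 2]
    population = survivors + [mutate(random.choice(survivors)) for _ in survivors]

print("generation", gen, "->", repr(best), "fitness:", fitness(best), "/", len(TARGET))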



                  #9
                  Originally posted by abc123 View Post

                  4. (MOST RELEVANT) Realistically speaking, there's no reason for the robot to have any complex, autonomous A.I. If ground-based military robots are ever put into combat situations, it'll almost certainly be like the Predator drone, where human controllers miles away direct the actions of the machine. Though to be honest, there's no need to get into hypotheticals; the Predator itself is a big ethical issue, I think. (Is it really ethical to kill people by remote control? Is it honorable? Is it fair? Is their use a cause for celebration? For dismay?)


                  I actually like the Predator as a machine, but I find it very unethical that they can snipe targets from halfway around the world via remote control... it's too much like a video game.


                  And speaking of pre-programmed AI.....

                  I have a Roboquad - a large white robot with four legs that walks like an insect, made by the same people who make the Robosapien. Anyway, if you set it off in "auto" mode it's programmed to find its way out of a room, and no matter where I put it to start it off, it somehow finds its way around the room and out the door... then down the hallway and into another room.

                  What's interesting is that no matter where I place it, it weaves its way around obstacles and other objects. It also has a series of behaviours such as happy, sad and angry, and sometimes, if a person or object gets too close to it while it's walking, it will get angry, move backwards and try to move around the object..
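
                  (That kind of thing is usually just a small reactive control loop: simple rules mapping sensor readings to actions, plus a "mood" that biases them. Here's a minimal sketch of the idea; the sensor, thresholds and moods are invented for illustration and have nothing to do with the Roboquad's actual firmware.)

                  Code:
import random

def read_ir_distance_cm():
    # Stand-in for an infrared obstacle sensor reading, in centimetres.
    return random.uniform(5, 100)

def step(mood):
    """Pick the next action, and possibly a new mood, from one sensor reading."""
    distance = read_ir_distance_cm()
    if distance < 15:
        # Something is too close: back up, turn, and get "angry".
        return "back_up_and_turn", "angry"
    if distance < 40:
        # Obstacle ahead: weave around it, mood unchanged.
        return "turn_toward_open_space", mood
    # Clear path: wander forward; an angry robot calms back down to "happy".
    return "walk_forward", "happy" if mood == "angry" else mood

mood = "curious"
for _ in range(10):
    action, mood = step(mood)
    print(mood, "->", action)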

                  I just find this kind of thing quite intriguing.....
                  Go home aliens, go home!!!!



                    #10
                    I'm still gonna' say technology changes morality.

                    Sex causes babies, so rules are set up to make sure people only have sex in certain circumstances. Reasonable birth control comes along, and the rules stay in place until people basically stop paying attention to them, then the rules relax. Inasmuch as the rules are a snapshot of where morality was at the time they were written down, obviously, current morality has changed.

                    Likewise porn: 50 years ago, porn was the province of dirty old men in raincoats around big cities. Everyone said 'that's bad.' 40 years ago it was that, plus T&A magazines in plain brown wrappers. Everyone said 'that's bad.' 30 years ago it was VHS, and everyone said 'that's bad,' but what they thought was 'I wonder if I can swipe that when he's not looking?' 20 years ago porn became easily available in the privacy of your own home via the internet, and it suddenly went from 'that's bad' to 'wow, Jerry, you'll never believe what I'm downloading! It's got a freakin' marmot in it and everything!' And *now* nobody thinks much about discussing it in public, there are jokes about it everywhere, and there are porn actresses who've crossed over into something like respectable life. So there you have it: in 50 years, technology has changed public morality.
                    Sincerely,

                    Kevin Long
                    (The Artist Formerly Known As Republibot 3.0)
                    http://www.kevin-long.com



                      #11
                      Originally posted by Republibot 3.0 View Post
                      I'm still gonna' say technology changes morality.

                      Sex causes babies, so rules are set up to make sure people only have sex in certain circumstances. Reasonable birth control comes along, and the rules stay in place until people basically stop paying attention to them, then the rules relax. Inasmuch as the rules are a snapshot of where morality was at the time they were written down, obviously, current morality has changed.

                      Likewise porn: 50 years ago, porn was the province of dirty old men in raincoats around big cities. Everyone said 'that's bad.' 40 years ago it was that, plus T&A magazines in plain brown wrappers. Everyone said 'that's bad.' 30 years ago it was VHS, and everyone said 'that's bad,' but what they thought was 'I wonder if I can swipe that when he's not looking?' 20 years ago porn became easily available in the privacy of your own home via the internet, and it suddenly went from 'that's bad' to 'wow, Jerry, you'll never believe what I'm downloading! It's got a freakin' marmot in it and everything!' And *now* nobody thinks much about discussing it in public, there are jokes about it everywhere, and there are porn actresses who've crossed over into something like respectable life. So there you have it: in 50 years, technology has changed public morality.
                      I was all ready to go off on a tangent about perception being different from reality, but when I came across that line I just lost my s--t. Howled with laughter--thank god there's nobody within earshot of me here at work
                      "A society grows great when old men plant trees, the shade of which they know they will never sit in. Good people do things for other people. That's it, the end." -- Penelope Wilton in Ricky Gervais's After Life



                        #12
                        Originally posted by DigiFluid View Post
                        I was all ready to go off on a tangent about perception being different from reality, but when I came across that line I just lost my s--t. Howled with laughter--thank god there's nobody within earshot of me here at work

                        Actually, playing Deus Ex, I wonder how far we might go with technology. Do you think we will get that level of artificial limbs and stuff? Part of me would love that. I'd love eyes that could be perfect, which for me would open up so many more doors and opportunities... But then you think of what you'd sacrifice for those eyes.
                        Go home aliens, go home!!!!



                          #13
                          Originally posted by DigiFluid View Post
                          I was all ready to go off on a tangent about perception being different from reality, but when I came across that line I just lost my s--t. Howled with laughter--thank god there's nobody within earshot of me here at work
                          If there's a bad marmot joke, I haven't heard it. <G>
                          Sincerely,

                          Kevin Long
                          (The Artist Formerly Known As Republibot 3.0)
                          http://www.kevin-long.com



                            #14
                            Originally posted by Coco Pops View Post
                            Actually, playing Deus Ex, I wonder how far we might go with technology. Do you think we will get that level of artificial limbs and stuff? Part of me would love that. I'd love eyes that could be perfect, which for me would open up so many more doors and opportunities... But then you think of what you'd sacrifice for those eyes.
                            Not that this is really related to morality, but, yeah, I don't think we're that far away. The Army has been experimenting with powered prosthetics for a decade now, some of which work by monitoring nerve impulses.
                            Sincerely,

                            Kevin Long
                            (The Artist Formerly Known As Republibot 3.0)
                            http://www.kevin-long.com



                              #15
                              What's a marmot?
                              Go home aliens, go home!!!!

