
Humanity is DOOMED

Discussion in 'BBS Hangout' started by KingCheetah, Feb 27, 2011.

  1. rimrocker

    rimrocker Member

    Joined:
    Dec 22, 1999
    Messages:
    23,048
    Likes Received:
    9,962
    Yes, yes. I agree with the title of this thread.
     
  2. Ziggy

    Ziggy QUEEN ANON

    Joined:
    Jun 11, 1999
    Messages:
    37,265
    Likes Received:
    13,730
    Harder, daddy
     
    Ubiquitin likes this.
  3. rocketsjudoka

    rocketsjudoka Member

    Joined:
    Jul 24, 2007
    Messages:
    58,167
    Likes Received:
    48,334
    Hentai fans REJOICE!
     
  4. Yung-T

    Yung-T Member

    Joined:
    Apr 16, 2009
    Messages:
    24,403
    Likes Received:
    7,053
    [IMG]
     
    Ubiquitin and Xerobull like this.
  5. Xerobull

    Xerobull ...and I'm all out of bubblegum
    Supporting Member

    Joined:
    Jun 18, 2003
    Messages:
    36,819
    Likes Received:
    35,666

    First thought too.

    Second thought was the squid robots from The Matrix

    Third thought was cyborg Cthulhus.
     
  6. London'sBurning

    Joined:
    Dec 5, 2002
    Messages:
    7,205
    Likes Received:
    4,817
  7. rocketsjudoka

    rocketsjudoka Member

    Joined:
    Jul 24, 2007
    Messages:
    58,167
    Likes Received:
    48,334
    Why not all of the above? :eek:
     
    Xerobull likes this.
  8. Xerobull

    Xerobull ...and I'm all out of bubblegum
    Supporting Member

    Joined:
    Jun 18, 2003
    Messages:
    36,819
    Likes Received:
    35,666
    New Chip Expands the Possibilities for AI
    An energy-efficient chip called NeuRRAM fixes an old design flaw to run large-scale AI algorithms on smaller devices, reaching the same accuracy as wasteful digital computers.
    [IMG]
    Señor Salme for Quanta Magazine
    Introduction
    Artificial intelligence algorithms cannot keep growing at their current pace. Algorithms like deep neural networks — which are loosely inspired by the brain, with multiple layers of artificial neurons linked to each other via numerical values called weights — get bigger every year. But these days, hardware improvements are no longer keeping pace with the enormous amount of memory and processing capacity required to run these massive algorithms. Soon, the size of AI algorithms may hit a wall.

    And even if we could keep scaling up hardware to meet the demands of AI, there’s another problem: running these algorithms on traditional computers wastes an enormous amount of energy. The high carbon emissions generated from running large AI algorithms are already harmful to the environment, and the problem will only get worse as the algorithms grow ever more gigantic.

    One solution, called neuromorphic computing, takes inspiration from biological brains to create energy-efficient designs. Unfortunately, while these chips can outpace digital computers in conserving energy, they’ve lacked the computational power needed to run a sizable deep neural network. That’s made them easy for AI researchers to overlook.

    That finally changed in August, when Weier Wan, H.-S. Philip Wong, Gert Cauwenberghs and their colleagues revealed a new neuromorphic chip called NeuRRAM that includes 3 million memory cells and thousands of neurons built into its hardware to run algorithms. It uses a relatively new type of memory called resistive RAM, or RRAM. Unlike previous RRAM chips, NeuRRAM is programmed to operate in an analog fashion to save more energy and space. While digital memory is binary — storing either a 1 or a 0 — analog memory cells in the NeuRRAM chip can each store multiple values along a fully continuous range. That allows the chip to store more information from massive AI algorithms in the same amount of chip space.
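    The analog storage idea described above can be illustrated with a toy quantizer: a binary cell forces each stored weight to one of two values, while a multi-level cell can land much closer to the true weight. The function and level counts below are illustrative assumptions, not NeuRRAM specifications.

```python
# Toy model of binary vs. multi-level ("analog") weight storage.
# A digital cell holds 1 bit; an analog RRAM cell can sit at one of many
# conductance levels, so a single cell stores a weight far more precisely.

def quantize(weight, levels, w_min=-1.0, w_max=1.0):
    """Snap a weight in [w_min, w_max] to the nearest of `levels` cell states."""
    step = (w_max - w_min) / (levels - 1)
    index = round((weight - w_min) / step)
    return w_min + index * step

print(quantize(0.37, levels=2))   # binary cell: snaps all the way to 1.0
print(quantize(0.37, levels=33))  # multi-level cell: 0.375, much closer
```

    The more levels a cell can reliably hold, the more of a network's weight matrix fits in the same silicon area.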

    As a result, the new chip can perform as well as digital computers on complex AI tasks like image and speech recognition, and the authors claim it is up to 1,000 times more energy efficient, opening up the possibility for tiny chips to run increasingly complicated algorithms within small devices previously unsuitable for AI, like smartwatches and phones.

    Researchers not involved in the work have been deeply impressed by the results. “This paper is pretty unique,” said Zhongrui Wang, a longtime RRAM researcher at the University of Hong Kong. “It makes contributions at different levels — at the device level, at the circuit architecture level, and at the algorithm level.”

    Creating New Memories
    In digital computers, the huge amount of energy wasted while they run AI algorithms is caused by a simple and ubiquitous design flaw that makes every single computation inefficient. Typically, a computer’s memory — which holds the data and numerical values it crunches during computation — is placed on the motherboard away from the processor, where computing takes place.

    For the information coursing through the processor, “it’s kind of like you spend eight hours on the commute, but you do two hours of work,” said Wan, a computer scientist formerly at Stanford University who recently moved to the AI startup Aizip.

    [IMG]
    The NeuRRAM chip can run computations within its memory, where it stores data not in traditional binary digits, but in an analog spectrum.

    Fixing this problem with new all-in-one chips that put memory and computation in the same place seems straightforward. It’s also closer to how our brains likely process information, since many neuroscientists believe that computation happens within populations of neurons, while memories are formed when the synapses between neurons strengthen or weaken their connections. But creating such devices has proved difficult, since current forms of memory are incompatible with the technology in processors.

    Computer scientists decades ago developed the materials to create new chips that perform computations where memory is stored — a technology known as compute-in-memory. But with traditional digital computers performing so well, these ideas were overlooked for decades.

    “That work, just like most scientific work, was kind of forgotten,” said Wong, a professor at Stanford.

    Indeed, the first such device dates back to at least 1964, when electrical engineers at Stanford discovered they could manipulate certain materials, called metal oxides, to turn their ability to conduct electricity on and off. That’s significant because a material’s ability to switch between two states provides the backbone for traditional memory storage. Typically, in digital memory, a state of high voltage corresponds to a 1, and low voltage to a 0.

    To get an RRAM device to switch states, you apply a voltage across metal electrodes hooked up to two ends of the metal oxide. Normally, metal oxides are insulators, which means they don’t conduct electricity. But with enough voltage, the current builds up, eventually pushing through the material’s weak spots and forging a path to the electrode on the other side. Once the current has broken through, it can flow freely along that path.
    [IMG]

    Wong likens this process to lightning: When enough charge builds up inside a cloud, it quickly finds a low-resistance path and lightning strikes. But unlike with lightning, whose path disappears, the path through the metal oxide remains, meaning it stays conductive indefinitely. And it’s possible to erase the conductive path by applying another voltage to the material. So researchers can switch RRAM devices between two states and use them to store digital memory.

    Midcentury researchers didn’t recognize the potential for energy-efficient computing, nor did they need it yet with the smaller algorithms they were working with. It took until the early 2000s, with the discovery of new metal oxides, for researchers to realize the possibilities.

    Wong, who was working at IBM at the time, recalls that an award-winning colleague working on RRAM admitted he didn’t fully understand the physics involved. “If he doesn’t understand it,” Wong remembers thinking, “maybe I should not try to understand it.”

    But in 2004, researchers at Samsung Electronics announced that they had successfully integrated RRAM memory built on top of a traditional computing chip, suggesting that a compute-in-memory chip might finally be possible. Wong resolved to at least try.
     
    KingCheetah and Yung-T like this.
  9. Xerobull

    Xerobull ...and I'm all out of bubblegum
    Supporting Member

    Joined:
    Jun 18, 2003
    Messages:
    36,819
    Likes Received:
    35,666
    continued....



    Compute-in-Memory Chips for AI

    For more than a decade, researchers like Wong worked to build up RRAM technology to the point where it could reliably handle high-powered computing tasks. Around 2015, computer scientists began to recognize the enormous potential of these energy-efficient devices for large AI algorithms, which were beginning to take off. That year, scientists at the University of California, Santa Barbara showed that RRAM devices could do more than just store memory in a new way. They could execute basic computing tasks themselves — including the vast majority of computations that take place within a neural network’s artificial neurons, which are simple matrix multiplication tasks.
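    The matrix multiplication happens physically in the RRAM array: weights are stored as conductances, input activations arrive as voltages on the rows, and Ohm's and Kirchhoff's laws sum the products as currents on the columns. A minimal numerical sketch of that idea (illustrative names and values, not the actual chip interface):

```python
# Sketch of analog matrix-vector multiplication in an RRAM crossbar.
# Each weight is a conductance G[i][j]; a row voltage V[i] produces a
# current G[i][j] * V[i] (Ohm's law), and the currents flowing down
# column j add up automatically (Kirchhoff's current law).

def crossbar_multiply(conductances, voltages):
    """Return column currents I_j = sum_i G[i][j] * V[i]."""
    n_rows, n_cols = len(conductances), len(conductances[0])
    return [sum(conductances[i][j] * voltages[i] for i in range(n_rows))
            for j in range(n_cols)]

# A 2x3 weight matrix (arbitrary conductance units) driven by two inputs:
G = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0]]
V = [1.0, 2.0]
print(crossbar_multiply(G, V))  # [9.0, 12.0, 15.0]
```

    In the physical device this sum costs no extra compute steps at all; the wiring does it.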

    “I definitely consider it a groundbreaking work,” said Melika Payvand, a neuromorphic researcher at the Swiss Federal Institute of Technology Zurich.

    For several years, Wong’s team worked with collaborators to design, manufacture, test, calibrate and run AI algorithms on the NeuRRAM chip. They did consider using other emerging types of memory that can also be used in a compute-in-memory chip, but RRAM had an edge because of its advantages in analog programming, and because it was relatively easy to integrate with traditional computing materials.

    “This work is the first demonstration,” said Anup Das, a computer scientist at Drexel University.

    “Digital AI systems are flexible and precise, but orders of magnitude less efficient,” said Cauwenberghs. Now, Cauwenberghs said, their flexible, precise and energy-efficient analog RRAM chip has “bridged the gap for the first time.”

    Scaling Up

    The team’s design keeps the NeuRRAM chip tiny — just the size of a fingernail — while squeezing in 3 million RRAM memory devices that can serve as analog processors. And while it can run neural networks at least as well as digital computers do, the chip is also the first that can run algorithms performing computations in different directions. It can input a voltage to the rows of the RRAM array and read outputs from the columns, as is standard for RRAM chips, but it can also run backward, from the columns to the rows, so it can be used in neural networks that operate with data flowing in different directions.
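    Running the array in both directions amounts to multiplying by the stored matrix and also by its transpose without moving any weights, which is what backpropagation-style algorithms need. A toy model of the two read directions (hypothetical function names, not the chip's real interface):

```python
# Toy model of bidirectional readout from a crossbar of conductances G.

def drive_rows(G, row_voltages):
    """Forward direction: voltages on rows, currents read from columns."""
    return [sum(G[i][j] * row_voltages[i] for i in range(len(G)))
            for j in range(len(G[0]))]

def drive_columns(G, col_voltages):
    """Reverse direction: voltages on columns, currents read from rows.
    Same stored conductances, so this is the transpose multiply."""
    return [sum(G[i][j] * col_voltages[j] for j in range(len(G[0])))
            for i in range(len(G))]

G = [[1.0, 2.0],
     [3.0, 4.0]]
print(drive_rows(G, [1.0, 1.0]))     # column sums: [4.0, 6.0]
print(drive_columns(G, [1.0, 1.0]))  # row sums: [3.0, 7.0]
```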

    As with RRAM technology itself, this has long been possible, but no one thought to do it. “Why didn’t we think about this before?” Payvand asked. “In hindsight, I don’t know.”

    “This actually opens up a lot of other opportunities,” said Das. As examples, he mentioned the ability of a simple system to run the enormous algorithms needed for multidimensional physics simulations or self-driving cars.

    Yet size is an issue. The largest neural networks now contain billions of weights, not the millions contained in the new chips. Wong plans to scale up by stacking multiple NeuRRAM chips on top of each other.

    It will be just as important to keep the energy costs low in future devices, or to scale them down even further. One way to get there is by copying the brain even more closely to adopt the communication signal used between real neurons: the electrical spike. It’s a signal fired off from one neuron to another when the difference in the voltage between the inside and outside of the cell reaches a critical threshold.

    “There are big challenges there,” said Tony Kenyon, a nanotechnology researcher at University College London. “But we still might want to move in that direction, because … chances are that you will have greater energy efficiency if you’re using very sparse spikes.” To run algorithms that spike on the current NeuRRAM chip would likely require a totally different architecture, though, Kenyon noted.

    For now, the energy efficiency the team accomplished while running large AI algorithms on the NeuRRAM chip has created new hope that memory technologies may represent the future of computing with AI. Maybe one day we’ll even be able to match the human brain’s 86 billion neurons and the trillions of synapses that connect them without running out of power.
     
    KingCheetah and Yung-T like this.
  10. boomboom

    boomboom I GOT '99 PROBLEMS

    Joined:
    Sep 29, 1999
    Messages:
    12,756
    Likes Received:
    9,402
    tl;dr


    Can AI give me a summary please?
     
    Buck Turgidson likes this.
  11. Xerobull

    Xerobull ...and I'm all out of bubblegum
    Supporting Member

    Joined:
    Jun 18, 2003
    Messages:
    36,819
    Likes Received:
    35,666
    AI chips designed around the way the human brain works. What could go wrong?
     
    boomboom likes this.
  12. boomboom

    boomboom I GOT '99 PROBLEMS

    Joined:
    Sep 29, 1999
    Messages:
    12,756
    Likes Received:
    9,402

    So AI is going to be illogical, sociopathic and depressed? Great! Another person/thing I'll have to fight against to get a therapy appointment.
     
    Yung-T and Xerobull like this.
  13. Rashmon

    Rashmon Member

    Joined:
    Jun 2, 2000
    Messages:
    21,175
    Likes Received:
    18,158
    uh oh...

    Shock wave from sun has opened up a crack in Earth's magnetic field, and it could trigger a geomagnetic storm

    A mysterious shock wave in a gust of solar wind has sent a barrage of high-speed material smashing into Earth’s magnetic field, opening up a crack in the magnetosphere. The barrage of plasma could lead to a geomagnetic storm today (Dec. 19), according to spaceweather.com.

    The shockwave’s origins aren’t exactly known, but scientists think it could have come from a coronal mass ejection launched by the sunspot AR3165, a fizzing region on the sun’s surface that released a flurry of at least eight solar flares on Dec. 14, causing a brief radio blackout over the Atlantic Ocean.

    Sunspots are areas on the sun's surface where powerful magnetic fields, created by the flow of electrical charges, knot into kinks before suddenly snapping. The resulting release of energy launches bursts of radiation called solar flares, or plumes of solar material called coronal mass ejections (CMEs). Once launched, CMEs travel at speeds in the millions of miles per hour, sweeping up charged particles from the solar wind to form a giant, combined wavefront that (if pointed toward Earth) can trigger geomagnetic storms.

    Geomagnetic storms occur when energetic solar debris (mostly consisting of electrons, protons and alpha particles) gets absorbed by, and subsequently compresses, Earth’s magnetic field. The solar particles zip through the atmosphere near the poles where Earth's protective magnetic field is weakest and agitate oxygen and nitrogen molecules — causing them to release energy in the form of light to form colorful auroras such as the northern lights.

    The storms can also create cracks in the magnetosphere which remain open for hours at a time, enabling some solar material to stream through and disrupt satellites, radio communications, and power systems.

    Thankfully today's potential storm, predicted to be a G-1 class, will be fairly weak. It may cause minor fluctuations in power grids and impair some satellite functions — including those for mobile devices and GPS systems. It could also cause an aurora to appear as far south as Michigan and Maine.

    More extreme geomagnetic storms, however, can have far more serious effects. They can not only warp our planet's magnetic field powerfully enough to send satellites tumbling to Earth, but can disrupt electrical systems and even cripple the internet.

    The upcoming storm is just the latest in a string of solar attacks fired at Earth as the sun ramps up into the most active phase of its roughly 11-year solar cycle.

    Astronomers have known since 1775 that solar activity rises and falls in cycles, but recently, the sun has been more active than expected, with nearly double the sunspot appearances predicted by the National Oceanic and Atmospheric Administration.

    Scientists anticipate that the sun's activity will steadily climb for the next few years, reaching an overall maximum in 2025 before decreasing again.

    The largest solar storm in recent history was the 1859 Carrington Event, which released roughly the same energy as 10 billion 1-megaton atomic bombs. After slamming into Earth, the powerful stream of solar particles fried telegraph systems around the world and caused auroras brighter than the light of the full moon to appear as far south as the Caribbean.

    If a similar event were to happen today, scientists warn it would cause trillions of dollars’ worth of damage, trigger widespread blackouts, and endanger thousands of lives. A previous solar storm in 1989 released a billion-ton plume of gas that caused a blackout across the entire Canadian province of Quebec, NASA reported.

    But this may not even scratch the surface of what our star is capable of hurling at us. Scientists are also investigating the cause of a series of sudden and colossal spikes in radiation levels recorded in ancient tree rings across Earth's history. A leading theory is that the spikes could have come from solar storms 80 times more powerful than the Carrington Event, but scientists have yet to rule out some other potentially unknown cosmic source.
     
    KingCheetah likes this.
  14. boomboom

    boomboom I GOT '99 PROBLEMS

    Joined:
    Sep 29, 1999
    Messages:
    12,756
    Likes Received:
    9,402
    If it breaks social media, then I'm on the side of the geomagnetic storm. Go Storm Go! Go Storm Go!
     
    Buck Turgidson and Ubiquitin like this.
  15. Buck Turgidson

    Joined:
    Feb 14, 2002
    Messages:
    100,259
    Likes Received:
    102,341
    Before I start prepping for the end, I'll need some confirmation from other than livescience dot com and spaceweather dot com.
     
    Rashmon likes this.
  16. Invisible Fan

    Invisible Fan Member

    Joined:
    Dec 5, 2001
    Messages:
    45,954
    Likes Received:
    28,046
    Stormfront?
     
    Xerobull and boomboom like this.
  17. Xerobull

    Xerobull ...and I'm all out of bubblegum
    Supporting Member

    Joined:
    Jun 18, 2003
    Messages:
    36,819
    Likes Received:
    35,666
    Invisible Fan likes this.
  18. Xerobull

    Xerobull ...and I'm all out of bubblegum
    Supporting Member

    Joined:
    Jun 18, 2003
    Messages:
    36,819
    Likes Received:
    35,666

    [IMG]
     
    TimDuncanDonaut likes this.
  19. KingCheetah

    KingCheetah Atomic Playboy
    Supporting Member

    Joined:
    Jun 3, 2002
    Messages:
    59,079
    Likes Received:
    52,746
  20. CrazyJoeDavola

    Joined:
    Apr 30, 2003
    Messages:
    2,327
    Likes Received:
    3,082
    This robot out Mick Jaggered Mick Jagger

     
