
Thread: The Automation Spiral (obligatory loleconomics thread v2)

  1. #1
    Donor Pattern's Avatar
    Join Date
    April 9, 2011
    Posts
    7,221

    The Automation Spiral (obligatory loleconomics thread v2)

    So...

    It's happening: the robots are coming for your jerbs, the algorithms are better at writing new algorithms, and us nerds rejoice as sex bots easily replace sex workers as our preferred isk sink.

    And as austerity (with a lot of neoliberalism on the side) led to Brexit and Trump, join me on another spiral as we watch what happens when capitalism eats the very thing that gives it meaning.

    Here's some $500 popcorn - 100% untouched by human hands.


    https://www.theatlantic.com/business...talism/524943/
      Spoiler:
    A job advertisement celebrating sleep deprivation? That’s late capitalism. Free-wheeling Coachella outfits that somehow all look the same and cost thousands of dollars? Also late capitalism. Same goes for this wifi-connected $400 juicer that does no better than human hands, Pepsi’s advertisement featuring Kendall Jenner, United Airlines’ forcible removal of a seated passenger who just wanted to go home, and the glorious debacle that was the Fyre Festival. The phrase—ominous, academic, despairing, sarcastic—has suddenly started showing up everywhere.

    This publication has used “late capitalism” roughly two dozen times in recent years, describing everything from freakishly oversized turkeys to double-decker armrests for steerage-class plane seats. The New Yorker is likewise enamored of it, invoking it in discussions of Bernie Sanders and fancy lettuces, among other things. There is a wildly popular, year-old Reddit community devoted to it, as well as a Facebook page, a Tumblr, and a lively Twitter hashtag. Google search interest in it has more than doubled in the past year.

    “Late capitalism,” in its current usage, is a catchall phrase for the indignities and absurdities of our contemporary economy, with its yawning inequality and super-powered corporations and shrinking middle class. But what is “late capitalism,” really? Where did the phrase come from, and why did so many people start using it all of a sudden?

    For my own part, I vaguely remembered it coming from the writings of Karl Marx—the decadence that precedes the revolution? I polled a few friends, and they all sort of remembered the same thing, something to do with 19th-century Europeans and the inherent instability of the capitalist system. This collective half-remembering turned out to be not quite right. “It’s not Marx’s term,” William Clare Roberts, a political scientist at McGill University, told me.

    Rather, it was Marxist thinkers that came up with it to describe the industrialized economies they saw around them. A German economist named Werner Sombart seems to have been the first to use it around the turn of the 20th century, with a Marxist theorist and activist named Ernest Mandel popularizing it a half-century later. For Mandel, “late capitalism” denoted the economic period that started with the end of World War II and ended in the early 1970s, a time that saw the rise of multinational corporations, mass communication, and international finance. Roberts said that the term’s current usage departs somewhat from its original meaning. “It’s not this sense that things are getting so bad that the revolution is going to come,” he told me, “but rather that we see the ligaments of the international system that socialists will be able to seize and use.”

    Mandel did warn about the forces of automation, globalization, and wage stagnation, and feared that they would tear at the social fabric by making workers miserable. Still, during the period that he defined as “late capitalism,” the American middle class was flourishing and Europe was healing. “There is an irony to it being [originally] used to refer to the one time things were going well for a while,” Richard Yeselson, a contributing editor at Dissent and an expert on the labor movement, told me, riffing on the term.

    “Late capitalism” took on a darker connotation in the works of the 20th-century critical theorists, who borrowed from and critiqued and built on Marx and the Marxists. Members of the Frankfurt School, reeling from the horrors of World War II, saw in it excessive social control on the part of big government and big business. Theodor Adorno argued that “late capitalism” might lead not to socialism, but away from it, by blunting the proletariat’s potential for revolution. “The economic process continues to perpetuate domination over human beings,” he said in a speech on late capitalism in 1968. (If only he could have seen the Jenner-Pepsi ad.)

    It was Duke University’s Fredric Jameson who introduced the phrase to a broader English-speaking audience of academics and theorists. “It was a much older and more popular term in German,” Jameson told me. (Spätkapitalismus, for those wondering.) “It’s very interesting! It’s kind of—how should I say it—symptomatic of people’s feelings about the world. About society itself,” he said, a little surprised and a little chuffed to hear that the term was finding wider appreciation. “It used to be a sort of taboo outside of the left to even mention the word ‘capitalism.’ Now it’s pretty obvious that it’s there, and that’s what it is.”

    In his canonical 1984 essay and 1991 book, both titled Postmodernism, or the Cultural Logic of Late Capitalism, Jameson argued that the globalized, post-industrial economy had given rise to postmodernist culture and art. Everything, everywhere, became commodified and consumable. High and low culture collapsed, with art becoming more self-referential and superficial. He told me he saw late capitalism as kicking into gear in the Thatcher and Reagan years, and persisting until today. “It has come out much more fully to the surface of things,” he said, citing the flash crash, derivatives, and “all this consumption by mail.”

    It took the emergence of a new, tenacious left to jailbreak it from the ivory tower and push it into wider use. In the wake of the Great Recession, the protestors of the Occupy movement occupied; the Sanders campaign found real, unexpected traction; the Fight for $15 helped convince 19 states and cities to boost their minimum wages up to $15 an hour. Editors and writers on the left founded or expanded publications like Jacobin, The New Inquiry, and n+1.

    Those cerebral outlets helped to fuel renewed interest in Marx and critical theory, as well as late capitalism. David Graeber, a leading figure in Occupy and the coiner of the phrase “We are the 99 percent,” for instance, wrote a long essay for The Baffler that touched on Jameson, Mandel, corporate profitability, flying cars, and, of course, late capitalism. The novel A Young Man’s Guide to Late Capitalism came out to good reviews in 2011. Pop-scholarly uses of the phrase started showing up in more mainstream publications, soaked up, as though by osmosis, from these publications and thinkers on the far left.

    The same happened on social media, itself growing rapidly as the recession gave way to the recovery. There were just a handful of mentions of “late capitalism” on Twitter before 2009, a few hundred in that year, and perhaps a few thousand in the next, many referring to college coursework.

    Over time, the semantics of the phrase shifted a bit. “Late capitalism” became a catchall for incidents that capture the tragicomic inanity and inequity of contemporary capitalism. Nordstrom selling jeans with fake mud on them for $425. Prisoners’ phone calls costing $14 a minute. Starbucks forcing baristas to write “Come Together” on cups due to the fiscal-cliff showdown.

    This usage captures the resurgent left’s anger over the recovery and the inequality that long preceded it—as well as the rage of millions of less politically engaged Americans who nevertheless feel left out and left behind. “I think it’s popular again now because the financial crisis and subsequent decade has really stripped away a veneer on what’s going on in the economy,” Mike Konczal, a fellow at the Roosevelt Institute, told me. “Austerity, runaway top incomes, globalization, populations permanently out of the job market, competition pushed further into our everyday lives. These aren’t new, but they have an extra cruelty that is boiling over everywhere.”

    The current usage also captures the perceived froth and foolishness of Silicon Valley. The gig economy in particular provides plenty of late-capitalist fodder, with investors showering cash on platforms to create cheap services for the rich and lazy and no-benefit jobs for the eager and poor. At the same time, traditional jobs seem to be providing less in the way of security, stability, and support, too. “There’s this growing discussion about how work is changing,” Carrie Gleason of the Fair Workweek Initiative told me. “The idea of what stable employment is, or what we can expect in terms of stable employment, is changing. That’s part of this.”

    “Late capitalism” skewers inequality, whether businesses’ feverish attempts to sell goods to the richest of the rich (here’s looking at you, $1,200 margarita) or to provide less and less to the rest (hey, airlines that make economy customers board after pets). It lampoons brands’ attempts to mimic or co-opt the language, culture, and content of their customers. Conspicuous minimalism, curated and artificial moments of zen, the gaslighting of the lifehacking and wellness movements: This is all late capitalism.

    Finally, “late capitalism” gestures to the potential for revolution, whether because the robots end up taking all the jobs or because the proletariat finally rejects all this nonsense. A “late” period always comes at the end of something, after all. “It has the constant referent to revolution,” Roberts said. “‘Late capitalism’ necessarily says, ‘This is a stage we’re going to come out of at some point,’ whereas ‘neoliberalism’ doesn’t say that, ‘Shit is fucked up and bullshit’ doesn’t say that. It hints at a sort of optimism amongst a post-Bernie left, the young left online. Something of the revolutionary horizon of classical Marxism.”

    It does all this with a certain concision, erudition, even beauty. It’s ominous and knowing, brainy and pissed-off. “Now is a crazy political time,” Yeselson said. “It’s Trump. It’s Brexit. It’s whatever is going on in France. Why talk about capitalism when nothing seems to be shaken up? But now things are shaken up. Let’s allude to the big, giant, totalistic system that is underneath everything. And let’s give it more than a hint of foreboding. Late capitalism. Late is so pregnant.”

    That it has strayed so far from its original meaning? Nobody I spoke with seemed to care, Jameson included, and the phrase has always had a certain malleability anyway. Sombart’s late capitalism differed from Mandel’s differed from Adorno’s differed from Jameson’s. “Late capitalism” often seems more like “the latest in capitalism,” Konczal quipped.

    This late capitalism is today’s, then. At least until the brands get ahold of it.


    Seriously though, it's everywhere because it's happening. And as politics devolves into which tribe can get the guy who gives them enough nods and winks into positions of power, Trump may look like gentle kite-flying weather before the inevitable shitstorm.

    Link to previous thread - if reading through 6 years and a thousand or so pages is your thing.
    Last edited by Pattern; May 2 2017 at 03:02:42 PM.
    Quote Originally Posted by Totally Not Larkonnis View Post
    at least we're not Greece.

  2. #2
    Keckers's Avatar
    Join Date
    July 31, 2012
    Posts
    24,445
    First for Marx was right.
    Look, the wages you withheld from the workmen who mowed your fields are crying out against you. The cries of the harvesters have reached the ears of the Lord of Hosts. You have lived on earth in luxury and self-indulgence. You have fattened yourselves for slaughter.

  3. #3
    Smuggo
    Guest
    Quote Originally Posted by Keckers View Post
    First for Marx was right.
    damnit!

  4. #4
    Shaikar's Avatar
    Join Date
    April 9, 2011
    Location
    Kador
    Posts
    3,246
    So...

    Why did we need a new thread?

  5. #5
    Keckers's Avatar
    Join Date
    July 31, 2012
    Posts
    24,445
    Quote Originally Posted by Shaikar View Post
    So...

    Why did we need a new thread?
    Austerity isn't as trendy as it once was.

    The new fashion is a race to the bottom for working conditions and automating the remaining workforce into poverty.
    Look, the wages you withheld from the workmen who mowed your fields are crying out against you. The cries of the harvesters have reached the ears of the Lord of Hosts. You have lived on earth in luxury and self-indulgence. You have fattened yourselves for slaughter.

  6. #6
    The Pube Whisperer Maximillian's Avatar
    Join Date
    April 10, 2011
    Location
    Sydney
    Posts
    4,605
    After that, my guess is that you will never see social equity again. The greatest trick the 1% ever pulled was convincing the world that they are progressive. And like that... your future is gone.

  7. #7
    Donor Spawinte's Avatar
    Join Date
    April 9, 2011
    Location
    Ireland
    Posts
    6,596
    Quote Originally Posted by Shaikar View Post
    So...

    Why did we need a new thread?
    We didn't.

  8. #8
    Smuggo
    Guest
    chillax m8s we're not going through any thread austerity here on FHC. Make all the threads you want in this shitposting utopia.

  9. #9
    Donor erichkknaar's Avatar
    Join Date
    April 10, 2011
    Posts
    17,223
    The automation thing will happen. As it is, I need practically no IT people to run a business, luddism, misunderstanding and distrust notwithstanding. However, it's going to take a while to get good at machines servicing machines and doing other complicated things like navigating spiral staircases.

    I suggest you all learn to service drones or Nest thermostats or something.
    meh

  10. #10
    Donor Pattern's Avatar
    Join Date
    April 9, 2011
    Posts
    7,221
    More context
    https://www.newscientist.com/article...ther-programs/
    OUT of the way, human, I’ve got this covered. A machine learning system has gained the ability to write its own code.

    Created by researchers at Microsoft and the University of Cambridge, the system, called DeepCoder, solved basic challenges of the kind set by programming competitions. This kind of approach could make it much easier for people to build simple programs without knowing how to write code.

    “All of a sudden people could be so much more productive,” says Armando Solar-Lezama at the Massachusetts Institute of Technology, who was not involved in the work. “They could build systems that it [would be] impossible to build before.”

    Ultimately, the approach could allow non-coders to simply describe an idea for a program and let the system build it, says Marc Brockschmidt, one of DeepCoder’s creators at Microsoft Research in Cambridge, UK.

    DeepCoder uses a technique called program synthesis: creating new programs by piecing together lines of code taken from existing software – just like a programmer might. Given a list of inputs and outputs for each code fragment, DeepCoder learned which pieces of code were needed to achieve the desired result overall.

    One advantage of letting an AI loose in this way is that it can search more thoroughly and widely than a human coder, so could piece together source code in a way humans may not have thought of. What’s more, DeepCoder uses machine learning to scour databases of source code and sort the fragments according to its view of their probable usefulness.

    All this makes the system much faster than its predecessors. DeepCoder created working programs in fractions of a second, whereas older systems take minutes to trial many different combinations of lines of code before piecing together something that can do the job. And because DeepCoder learns which combinations of source code work and which ones don’t as it goes along, it improves every time it tries a new problem.

    The technology could have many applications. In 2015, researchers at MIT created a program that automatically fixed software bugs by replacing faulty lines of code with working lines from other programs. Brockschmidt says that future versions could make it very easy to build routine programs that scrape information from websites, or automatically categorise Facebook photos, for example, without human coders having to lift a finger.

    “The potential for automation that this kind of technology offers could really signify an enormous [reduction] in the amount of effort it takes to develop code,” says Solar-Lezama.

    But he doesn’t think these systems will put programmers out of a job. With program synthesis automating some of the most tedious parts of programming, he says, coders will be able to devote their time to more sophisticated work.

    At the moment, DeepCoder is only capable of solving programming challenges that involve around five lines of code. But in the right coding language, a few lines are all that’s needed for fairly complicated programs.

    “Generating a really big piece of code in one shot is hard, and potentially unrealistic,” says Solar-Lezama. “But really big pieces of code are built by putting together lots of little pieces of code.”

    This article appeared in print under the headline “Computers are learning to code for themselves”
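
    To make the "program synthesis" idea above concrete, here's a minimal toy sketch in Python: brute-force search over a made-up DSL of list-transforming fragments until some composition of them matches every input/output example. Purely illustrative, this is not DeepCoder's actual DSL, and the article's point is that DeepCoder learns to rank fragments by probable usefulness so it doesn't have to try everything the way this sketch does.

    Code:
    # Toy sketch only -- NOT DeepCoder. The fragment names, the DSL and the
    # examples below are invented for illustration.
    from itertools import product

    # A tiny DSL of list-transforming fragments (name -> function).
    FRAGMENTS = {
        "sort":     sorted,
        "reverse":  lambda xs: list(reversed(xs)),
        "double":   lambda xs: [x * 2 for x in xs],
        "drop_neg": lambda xs: [x for x in xs if x >= 0],
        "head3":    lambda xs: list(xs)[:3],
    }

    def run(program, xs):
        """Apply a sequence of fragment names to an input list."""
        for name in program:
            xs = FRAGMENTS[name](xs)
        return xs

    def synthesise(examples, max_len=3):
        """Enumerate fragment sequences until one satisfies every
        input/output example. DeepCoder's learned ranking exists to
        prune exactly this kind of brute-force search."""
        names = list(FRAGMENTS)
        for length in range(1, max_len + 1):
            for program in product(names, repeat=length):
                if all(run(program, inp) == out for inp, out in examples):
                    return program
        return None

    # The "specification": keep the non-negatives, sorted, doubled.
    examples = [
        ([3, -1, 2],    [4, 6]),
        ([0, 5, -7, 1], [0, 2, 10]),
    ]
    print(synthesise(examples))  # first fit found: ('sort', 'double', 'drop_neg')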
    Meanwhile

    https://www.technologyreview.com/s/6...e-ai-software/

    Progress in artificial intelligence causes some people to worry that software will take jobs such as driving trucks away from humans. Now leading researchers are finding that they can make software that can learn to do one of the trickiest parts of their own jobs—the task of designing machine-learning software.

    In one experiment, researchers at the Google Brain artificial intelligence research group had software design a machine-learning system to take a test used to benchmark software that processes language. What it came up with surpassed previously published results from software designed by humans.

    In recent months several other groups have also reported progress on getting learning software to make learning software. They include researchers at the nonprofit research institute OpenAI (which was cofounded by Elon Musk), MIT, the University of California, Berkeley, and Google’s other artificial intelligence research group, DeepMind.

    If self-starting AI techniques become practical, they could increase the pace at which machine-learning software is implemented across the economy. Companies must currently pay a premium for machine-learning experts, who are in short supply.

    Jeff Dean, who leads the Google Brain research group, mused last week that some of the work of such workers could be supplanted by software. He described what he termed “automated machine learning” as one of the most promising research avenues his team was exploring.

    “Currently the way you solve problems is you have expertise and data and computation,” said Dean, at the AI Frontiers conference in Santa Clara, California. “Can we eliminate the need for a lot of machine-learning expertise?”

    One set of experiments from Google’s DeepMind group suggests that what researchers are terming “learning to learn” could also help lessen the problem of machine-learning software needing to consume vast amounts of data on a specific task in order to perform it well.

    The researchers challenged their software to create learning systems for collections of multiple different, but related, problems, such as navigating mazes. It came up with designs that showed an ability to generalize, and pick up new tasks with less additional training than would be usual.

    The idea of creating software that learns to learn has been around for a while, but previous experiments didn’t produce results that rivaled what humans could come up with. “It’s exciting,” says Yoshua Bengio, a professor at the University of Montreal, who previously explored the idea in the 1990s.

    Bengio says the more potent computing power now available, and the advent of a technique called deep learning, which has sparked recent excitement about AI, are what’s making the approach work. But he notes that so far it requires such extreme computing power that it’s not yet practical to think about lightening the load, or partially replacing, machine-learning experts.

    Google Brain’s researchers describe using 800 high-powered graphics processors to power software that came up with designs for image recognition systems that rivaled the best designed by humans.

    Otkrist Gupta, a researcher at the MIT Media Lab, believes that will change. He and MIT colleagues plan to open-source the software behind their own experiments, in which learning software designed deep-learning systems that matched human-crafted ones on standard tests for object recognition.

    Gupta was inspired to work on the project by frustrating hours spent designing and testing machine-learning models. He thinks companies and researchers are well motivated to find ways to make automated machine learning practical.

    “Easing the burden on the data scientist is a big payoff,” he says. “It could make you more productive, make you better models, and make you free to explore higher-level ideas.”
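
    And to make the "automated machine learning" bit concrete: stripped to its simplest form it's a search over model designs scored on held-out data. Below is a minimal random-search sketch in Python, assuming scikit-learn is installed and with an invented search space; the actual Google Brain work searches over neural-network architectures with reinforcement learning on hundreds of GPUs, so treat this as the idea, not the method.

    Code:
    # Toy sketch only: random search over invented model designs,
    # scored by validation accuracy. Assumes scikit-learn is available.
    import random

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

    def sample_design():
        """Draw one candidate design (model family + hyperparameters)."""
        if random.random() < 0.5:
            return RandomForestClassifier(
                n_estimators=random.choice([10, 50, 100]),
                max_depth=random.choice([3, 5, None]),
                random_state=0,
            )
        return LogisticRegression(C=random.choice([0.01, 0.1, 1.0, 10.0]),
                                  max_iter=1000)

    best_score, best_model = 0.0, None
    for _ in range(20):                    # the part a human expert usually does
        model = sample_design()
        model.fit(X_tr, y_tr)
        score = model.score(X_val, y_val)  # validation accuracy
        if score > best_score:
            best_score, best_model = score, model

    print(best_score, best_model)

    Swap the random sampler for a learned controller that proposes new designs based on which previous ones scored well, and you have the "learning to learn" loop the article describes.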
    Quote Originally Posted by Totally Not Larkonnis View Post
    at least we're not Greece.

  11. #11
    Donor Sparq's Avatar
    Join Date
    April 11, 2011
    Location
    Strayastan
    Posts
    10,033
    Quote Originally Posted by erichkknaar View Post
    I suggest you all learn to service drones or Nest thermostats or something.
    or join me * in the exciting (that is to say, secure from the perils of automation) field of Psychology, where we bug-hunt software for meat sacks.

    * assuming I even complete my degree
    Last edited by Sparq; May 2 2017 at 03:57:23 PM.

  12. #12

    Join Date
    November 5, 2011
    Posts
    14,099
    Quote Originally Posted by Shaikar View Post
    So...

    Why did we need a new thread?
    We don't really need more than one thread in general as they all merge into one at some point anyway

  13. #13
    Donor erichkknaar's Avatar
    Join Date
    April 10, 2011
    Posts
    17,223
    Quote Originally Posted by Sparq View Post
    Quote Originally Posted by erichkknaar View Post
    I suggest you all learn to service drones or Nest thermostats or something.
    or join me * in the exciting (that is to say, secure from the perils of automation) field of Psychology, where we bug-hunt software for meat sacks.

    * assuming I even complete my degree
    Servicing humans is good. We are getting pretty good at implementing psych models and selecting humans with them now as well.
    meh

  14. #14
    Smuggo
    Guest
    Quote Originally Posted by erichkknaar View Post
    Quote Originally Posted by Sparq View Post
    Quote Originally Posted by erichkknaar View Post
    I suggest you all learn to service drones or Nest thermostats or something.
    or join me * in the exciting (that is to say, secure from the perils of automation) field of Psychology, where we bug-hunt software for meat sacks.

    * assuming I even complete my degree
    Servicing humans is good. We are getting pretty good at implementing psych models and selecting humans with them now as well.
    Would be easier if we all had barcodes.

  15. #15
    Donor erichkknaar's Avatar
    Join Date
    April 10, 2011
    Posts
    17,223
    Quote Originally Posted by Smuggo View Post
    Quote Originally Posted by erichkknaar View Post
    Quote Originally Posted by Sparq View Post
    Quote Originally Posted by erichkknaar View Post
    I suggest you all learn to service drones or Nest thermostats or something.
    or join me * in the exciting (that is to say, secure from the perils of automation) field of Psychology, where we bug-hunt software for meat sacks.

    * assuming I even complete my degree
    Servicing humans is good. We are getting pretty good at implementing psych models and selecting humans with them now as well.
    Would be easier if we all had barcodes.
    You will.
    meh

  16. #16
    Donor Pattern's Avatar
    Join Date
    April 9, 2011
    Posts
    7,221
    I weep for all the kids currently learning TensorFlow; most already know their jobs will never exist.
    Quote Originally Posted by Totally Not Larkonnis View Post
    at least we're not Greece.

  17. #17
    Approaching Walrus's Avatar
    Join Date
    March 8, 2013
    Posts
    10,005
    Quote Originally Posted by Keckers View Post
    First for Marx was right.


    edit: obligatory 'thou shalt not make a machine in the likeness of a human mind'

    Frank Herbert didn't realize he was writing a prediction of the future
    Last edited by Approaching Walrus; May 2 2017 at 04:08:54 PM.

  18. #18
    Liare's Avatar
    Join Date
    April 9, 2011
    Location
    Denmark
    Posts
    15,330
    Quote Originally Posted by Keckers View Post
    First for Marx was right.
    there are not enough up-votes in the world to give this what it deserves.

    Quote Originally Posted by Pattern View Post
    More context
    https://www.newscientist.com/article...ther-programs/
    OUT of the way, human, I've got this covered. A machine learning system has gained the ability to write its own code. [...]

    Meanwhile

    https://www.technologyreview.com/s/6...e-ai-software/

    Progress in artificial intelligence causes some people to worry that software will take jobs such as driving trucks away from humans. [...]
    something something comparative advantage.

    of course, that doesn't actually prevent the future from becoming Elysium.

    Quote Originally Posted by Approaching Walrus View Post
    Quote Originally Posted by Keckers View Post
    First for Marx was right.


    edit: obligatory 'thou shall not make a machine in the likeness of a human mind'

    Frank Herbert didnt realize he was writing a prediction of the future
    the problem isn't smarter tools, and you're jumping into the luddite fallacy there; the problem is how the surplus from those tools is distributed.

    Brian Herbert's books in the Dune universe, while different from his father's, actually explore the Butlerian Jihad and its reasons in depth. Suffice to say, thinking machines are not the source of the problem.
    Viking, n.:
    1. Daring Scandinavian seafarers, explorers, adventurers, entrepreneurs world-famous for their aggressive, nautical import business, highly leveraged takeovers and blue eyes.
    2. Bloodthirsty sea pirates who ravaged northern Europe beginning in the 9th century.

    Hagar's note: The first definition is much preferred; the second is used only by malcontents, the envious, and disgruntled owners of waterfront property.

  19. #19
    Liare's Avatar
    Join Date
    April 9, 2011
    Location
    Denmark
    Posts
    15,330
    Quote Originally Posted by Duckslayer View Post
    Quote Originally Posted by Isyel View Post
    Quote Originally Posted by Duckslayer View Post
    Quote Originally Posted by Shaikar View Post
    So...

    Why did we need a new thread?
    We don't really need more than one thread in general as they all merge into one at some point anyway
    No they dont
    they do; aside from the paedo anime ghetto, most of the lolpolitics stuff has a fair bit of cross-pollination.
    Viking, n.:
    1. Daring Scandinavian seafarers, explorers, adventurers, entrepreneurs world-famous for their aggressive, nautical import business, highly leveraged takeovers and blue eyes.
    2. Bloodthirsty sea pirates who ravaged northern Europe beginning in the 9th century.

    Hagar's note: The first definition is much preferred; the second is used only by malcontents, the envious, and disgruntled owners of waterfront property.

  20. #20
    Keckers's Avatar
    Join Date
    July 31, 2012
    Posts
    24,445
    American Airlines stock lost more value for paying its labour force more money than United lost for beating up a passenger.
    Last edited by Keckers; May 2 2017 at 04:36:28 PM.
    Look, the wages you withheld from the workmen who mowed your fields are crying out against you. The cries of the harvesters have reached the ears of the Lord of Hosts. You have lived on earth in luxury and self-indulgence. You have fattened yourselves for slaughter.
