    • CommentAuthorprof_k
    • CommentTimeNov 5th 2009
    The part about it that really gets to me is the "flash" trade orders that sometimes let them jump to the head of a queue of trades. Early in the news coverage there were accusations that the trading firms running the backbone for electronic exchanges were improperly using their proximity to see trade data ahead of competitors, but I never saw anything concrete about that.

    Also interesting is this Arstechnica article on how high-speed trading was driving innovation in computer hardware.
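    Back-of-the-envelope on why that proximity business matters at all (distances here are made up, just to show the scale):

```python
# Rough arithmetic on the proximity advantage. Light in optical fiber
# travels at roughly 2/3 of c, i.e. about 200 km per millisecond.
# The distances below are illustrative, not actual exchange geography.

C_FIBER_KM_PER_MS = 200.0  # approx. km of fiber covered per millisecond

def one_way_latency_ms(distance_km):
    """Minimum one-way signal time over fiber, ignoring switching delays."""
    return distance_km / C_FIBER_KM_PER_MS

# A firm co-located in the exchange's data center (say, 1 km of cable)
# versus a competitor 1,000 km away:
colocated = one_way_latency_ms(1)     # 0.005 ms
distant = one_way_latency_ms(1000)    # 5.0 ms

print(f"co-located: {colocated:.3f} ms, distant: {distant:.3f} ms")
# The co-located firm sees the market milliseconds earlier, which is an
# eternity when orders are being decided in microseconds.
```

    Physics alone gives the nearby firm a head start, before anyone even does anything improper.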
    • CommentAuthorprof_k
    • CommentTimeNov 5th 2009
    Oh, and also the last couple of paragraphs in this piece from Charlie Stross, since the whole thing ruined a work-in-progress of his (again: Madoff & co. had already wrecked that particular work once).
    I expect that once knowledge of the techniques becomes widespread, governments will have to ban it to keep investors convinced that stock prices aren't just a product of digital manipulation. People already distrust Wall Street; if they decide the entire market is bullshit, a lot of pissed-off people will just forgo their 401(k) contributions and buy bullion.

    Anyway, won’t there be a point at which every bank, hedge fund, mutual fund, and small investor is doing this directly or through a service, the computers learn to ignore each other, and it stops working? Sort of like Search Engine Optimization—it works for a few weeks, and then suddenly you find yourself in 48,873,209th place and image searches only show Facebook albums of your interns getting drunk.
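    The crowding-out effect is easy to sketch as a toy model (all numbers made up): a roughly fixed pool of mispricing profit gets split among however many fast traders show up, while each one pays a fixed cost to stay fast (hardware, co-location, programmers).

```python
# Toy model of the arms race: a fixed profit pool divided among n firms,
# each paying a fixed cost for speed. Every number here is invented
# purely for illustration.

POOL = 100.0       # total arbitrage profit available per day (assumed)
FIXED_COST = 10.0  # per-firm daily cost of staying fast (assumed)

def per_firm_net(n_firms):
    """Each firm's profit after costs when n_firms compete for the pool."""
    return POOL / n_firms - FIXED_COST

for n in (1, 5, 10, 20):
    print(f"{n:2d} firms -> net per firm: {per_firm_net(n):+.1f}")
# 1 firm: +90.0, 5 firms: +10.0, 10 firms: 0.0, 20 firms: -5.0
# Once everyone is doing it, nobody makes money doing it.
```

    Which is exactly the SEO dynamic: the edge only exists while it's scarce.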
    • CommentAuthorFan
    • CommentTimeNov 5th 2009
    Bookstaber doesn't seem to have a particular scenario in mind, either, which is why he ends up (only half-seriously) proposing that everyone should just stop doing it for the greater good.

    As a programmer I've heard about it (the desire for speed in financial trading) for years: there have been people recruiting for specific types of programming experience in New York and London; but it seems to me useless, not a worthwhile occupation.

    Is it even necessary that *everyone* stop doing it for the greater good? Like, apparently it isn't necessary that *nobody* be a criminal or a drug addict: it's sufficient that *most* people find something more productive to do.

    "Also interesting is this Arstechnica article on how high-speed trading was driving innovation in computer hardware."

    An interesting part of that article is, "... medical imaging, defense, oil and gas exploration, pharmaceutical research, 3D rendering for movies, and finance. The thing that always strikes me about these presentations is that this handful of compute-intensive problem domains is typically listed as 'examples' of the sort of profitable niches you can target with a vendor's hardware, the implication being that there are just oodles of other such workload families out there that the presenter could adduce if he or she had enough space on the PowerPoint slide. Except that this isn't really the case. Those six items aren't just examples—they're basically the whole list, and this short list needs to grow, not shrink."

    They're pretty important examples, though:

    * medical imaging: exploring inside the human body
    * oil and gas exploration: exploring inside the earth's crust
    * pharmaceutical research: I guess that's molecular modelling, protein folding, etc.

    Anyway, I don't know whether the short list does need to grow. Chips evolve in other directions too, e.g. towards lower power consumption. Some (possibly most) of the world's need for increased processing comes from adding parallel capacity (e.g. more PCs, more cell phones, a larger number of intelligent devices, more networking) rather than from making each processor faster. Some of the jobs which require a lot of processing (e.g., being Google) are handled just by having a lot of ordinary-powered (not too expensive) processors.
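    The distinction between the two kinds of scaling is worth making explicit (illustrative numbers only): serving lots of independent requests, Google-style, scales linearly with machine count, whereas making one single job faster is capped by whatever fraction of it can't be parallelized (Amdahl's law).

```python
# Two kinds of scaling. The specific numbers (100 queries/s per box,
# 10% serial fraction) are assumptions chosen for illustration.

def throughput(machines, queries_per_machine=100):
    """Independent requests: capacity grows linearly with machine count."""
    return machines * queries_per_machine

def amdahl_speedup(processors, serial_fraction=0.1):
    """Max speedup of a single job when 10% of it can't be parallelized."""
    return 1 / (serial_fraction + (1 - serial_fraction) / processors)

print(throughput(1000))                # 100000 queries/s from cheap boxes
print(round(amdahl_speedup(1000), 1))  # ~9.9x, nowhere near 1000x
```

    Which is why a warehouse of ordinary processors works fine for search, while the compute-intensive "short list" domains are the ones that actually need each processor to get faster.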