Humodour


Everything posted by Humodour

  1. Humodour

    it's tech

    lol. I guess I should expect something like that from you.
  2. Humodour

    it's tech

    Simple: speed. Developing AI is actually about developing two separate technologies: software and hardware. Now, as you know, we've got the software already - we've had some version of it for decades (metaheuristic search) - and we can't go much further without the hardware to back it up. Which is where memristors come in.

    While you don't need memristors for AI, they look like they'll be a powerful enabling technology because they offer the capability to increase speed by orders of magnitude at a time when conventional transistors have all but hit the 'brick wall'. Taks listed all the reasons. My materials science is rusty, but I believe the most important of them is that they increase speed by taking up far less area than transistors (in all dimensions).

    "Williams adds that memristors could be used to speed up microprocessors by synchronizing circuits that tend to drift in frequency relative to one another or by doing the work of many transistors at once." http://www.sciam.com/article.cfm?id=missin...-of-electronics

    Besides that, though, memristors appear to exhibit quirky 'learning'/adaptive abilities which resemble those seen in biological life: http://lanl.arxiv.org/abs/0810.4179v2 I'd say that's just the tip of the iceberg.

    Somewhat loosely on topic: genetic algorithms are about a programme breeding competing solutions, right? Well, what about a programme that breeds competing programmes to solve a problem? Or a programme that does that, but on ITSELF (i.e. evolves itself to be better at evolving other solutions - this is what evolution itself does)? Or what about genetic algorithms that don't evolve programmes, but hardware? http://en.wikipedia.org/wiki/Genetic_programming http://en.wikipedia.org/wiki/Evolvable_hardware

    AI by itself is fairly tame. It can learn, but it can't change its own coding. It's no more than a human mind in a computer instead of a body. Humans can't (easily) change how their code (both DNA and neural net) works, and nor can your average AI. So the fears about AIs getting out of control and taking over the world are naive. But not impossible. If somebody coded not just an AI, but an AI that could change its own code (which seems possible given the above, but an order of magnitude harder than creating an AI in the first place), that would be something else.
  3. Humodour

    it's tech

    Shhh, taks. It's alright - there aren't any scarecrows on the Internet. Now sit back and relax with this soothing story about Hewlett-Packard's plans to begin manufacturing prototype memristor RAM in 2009.
  4. Yeah, Islamic charities in Iran are bent as all hell. I read a decent article on it a while ago, but I can't find it, so the wikipedia run-down will have to do: http://en.wikipedia.org/wiki/Bonyad
  5. Humodour

    it's tech

    Go on taks, tell us how you're going to solve 2^x = 4097 without logs. Don't let me down, man. I'm counting on you to come up with some madcap scheme of guesswork so you can claim some sort of vague intellectual superiority.
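    For reference, the log-based solution the post is daring taks to avoid is a one-liner (a Python aside, not part of the original post):

```python
import math

# 2**x = 4097 has no integer solution; taking log base 2 of both sides
# gives x = log2(4097), a shade over 12 (since 2**12 = 4096)
x = math.log2(4097)
print(x)
```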
  6. *smacks karka upside the head* Stop trolling.
  7. Humodour

    it's tech

    You're stupid. If you don't know the exact value of 2^18 without a calculator, you should be a garbage collector. /taks
  8. Man, I love the Internet. Judging by the tone of customers, online retailers, news articles, forum threads, and even various game company heads, DRM has about as much of a future as the Australian Internet censorship scheme. And, interestingly, the case against DRM rests on many of the same reasons people oppose the Internet filter.
  9. It's like for every flash of brilliance about this game, there's equally something that makes me facepalm. Which could still mean that in the end the game is totally awesome and perhaps even revolutionary, but it makes it hard for me to get excited or stay interested.
  10. Humodour

    it's tech

    I used to be a skeptic about AI, but in light of the fact that Moore's law has held steady, and all the new software and hardware advances we've made this decade alone, I'm finding it very hard to be skeptical these days. We're definitely less than a century away. How much less is the question. Supercomputing still has a little while to go before we can model the human brain. But 'a little while' in electronic speak isn't much under Moore's law (and that's at the current pace - things like graphene transistors or memristors will probably violate Moore's law in a very good way). IBM is working on brain simulation right now. Interestingly, while it might take a few more decades to emulate/simulate the human brain, something like a mouse or fruit fly brain is altogether less complex (but still complex), and I expect the first real artificial intelligence to be something akin to those.
  11. Humodour

    it's tech

    Either you're not coming across well in English or you're fairly ignorant about neural networks. Neural networks are about exactly the opposite of understanding how one node fits into the grand scheme. They're about holistic details and pattern emergence.

    You say we can't empirically understand neural networks, but I think you are mistaken. Neural networks are exactly the type of thing best suited for the scientific method: trial and error. It's actually very similar to what some biologists do with things like E. coli, because evolution is also an optimisation process (heck, genetic algorithms, anybody?).

    What you perhaps mean is that we can't know (or it's orders of magnitude harder to know) the exact details of the system as we could with GOFAI (good old-fashioned AI - symbolic manipulation, mostly) - we can't be 100% certain that our input will produce some desired output. But if you think that's a flaw then you don't understand AI. This deviation away from perfect knowledge of the system is exactly what AI researchers are after, because they are sacrificing consistency and reliability for fluidity and a degree of randomness; the beauty of neural networks (and other optimisation techniques such as genetic algorithms) is that they learn and adapt.

    Ahh, I'm not going to go into a big huge rant about it, but I really hope you take a closer look at the worth of metaheuristic optimisation, because it's not about to leave us any time soon (for good reason!). I will leave you with this little teaser of the potential behind metaheuristic optimisation techniques, though: http://www.physorg.com/news82910066.html

    taks: Yeah, it's pretty neat. I'm quite looking forward to it, but what's probably more awesome is that memristors are just one of many recent breakthroughs in electronics. There's been a huge amount of noise about graphene transistors, for example. Although, admittedly, graphene transistors don't offer the sort of paradigm shift that variable resistance does.
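    The trial-and-error learning described above can be illustrated with the simplest possible neural network, a single perceptron. This is a toy sketch (the AND training set, learning rate and epoch count are arbitrary choices for the demo), but it shows the loop of guess, measure error, adjust:

```python
import random

random.seed(1)  # fixed seed so the toy run is repeatable

# Logical AND as a tiny training set: ((inputs), target)
DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def predict(w, x):
    # threshold unit: fire iff the weighted sum exceeds zero (w[0] is a bias)
    return 1 if w[0] + w[1] * x[0] + w[2] * x[1] > 0 else 0

def train(epochs=50, lr=0.1):
    w = [random.uniform(-1, 1) for _ in range(3)]  # random starting weights
    for _ in range(epochs):
        for x, target in DATA:
            error = target - predict(w, x)
            # trial and error: nudge each weight in the direction
            # that reduces the error on this example
            w[0] += lr * error
            w[1] += lr * error * x[0]
            w[2] += lr * error * x[1]
    return w

w = train()
print([predict(w, x) for x, _ in DATA])
```

    Nothing in the loop 'understands' AND; correct behaviour emerges from repeated small corrections, which is the point about empiricism and pattern emergence above.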
  12. Eh, you're not making your case terribly well. An overrated game is something like Oblivion or NWN1. It sells a few million copies in the first 2 or 3 years, then you never hear from it again. This, on the other hand, is a couple of people who found the game boring and assuming that because it didn't meet their tastes that means it's overrated (read: think they're a better judge of a game than most everybody else who played it). Half-Life 1 wasn't even hyped up. It had to contend with a bunch of far more well-known FPS's at release, and it didn't start out with a bang. The years following its release saw far more copies sold than its initial launch did. That doesn't happen with an over-hyped piece of mediocrity. So I know this is really a trivial non-issue, but it's rather bemusing to see somebody try and retroactively reclassify HL1 as 'overrated' (or more hilarious still, equate it with Doom - you might manage that with Quake 2, though).
  13. Heck, you know what kirottu? You're damn right. I don't trust myself to express how I feel for you with words. Let this suffice: B==========D~~~~~~~~~~
  14. Russian vampires with breasts.

    Very very.

  15. Oh hello there little emo child.

  16. Well Turkey will pave roads and America will build bridges.
  17. I don't really like some of the screenshots and interface (ugly yellow font and excess anti-aliasing/blur/something). They're too 'cartoony'.
  18. Yeah, you're right there. Learning any second language is a massive boost to your first language. For example, while I haven't actually 'learnt' a second language yet, when I was briefly teaching myself some Russian I discovered what (in)definite articles were in English through Russian's lack of them. English: "The boy went to a shop." Pseudo-Russian: "Boy went to shop."
  19. Why would pirates dislike DRM? They generally download their games already loaded with the crack and never have to deal with DRM. Which just makes the situation all the more ironic.
  20. As I said, I respect that he found it boring - wasn't his cup of tea, but trying to justify that as the game being 'overrated' is like saying "I know better than the other 10 million people who consistently bought the game each year." Doesn't hold water, I'm afraid.
  21. You know, Lajciak, the best way to figure that out is to install Ubuntu (a dual-boot if you're unsure) and test it yourself for a week or so. That said, as far as I know, both Nvidia and ATI release Linux versions of their drivers. What would you use it for? As covered earlier, most games, aside from things like Quake 4 or those ported by third parties (e.g. Descent 3 or JA2), rely on WINE to run on Linux. Still, I haven't had any problems with graphics drivers when gaming. If you're just watching movies or doing image and video editing, you'll find the driver support is fine. Probably the best aspect of Ubuntu is that pretty much every programme you want is available for free in the package repositories. If you want, say, a Photoshop equivalent (GIMP), you don't have to search a web site or pay for it; you just load up the package manager and request it.
  22. While I don't disagree with this thought, I will just point out that I have played a lot of games in my life. Every time I have had problems with a game, it has been because of bugs, not DRM. My most recent experience is with Far Cry 2, which contains a game-stopping bug that affects quite a few users. I would trade DRM for bug-free games any day. What are you trying to say? That they should add DRM because it reduces bugs?
  23. To quote Gabe Newell, head of Valve: "As far as DRM goes, most DRM strategies are just dumb. The goal should be to create greater value for customers through service value (make it easy for me to play my games whenever and wherever I want to), not by decreasing the value of a product (maybe I'll be able to play my game and maybe I won't)."
  24. Well you're free to dislike the game, but you need to remember that doesn't mean the game is bad. I don't like GTA but I can tell the difference between a crap game and a game that doesn't appeal to me. I mean fair enough you found the game boring, but claiming that selling 10 million copies over 10 years is somehow evidence that the game is overrated is a bit bitter (and if anything I should think it means the opposite).
  25. I think it makes them overrated if you're a tripper like karka who hates Half-Life.