iMikeT
Aug 5, 07:36 PM
I can't wait!
rdowns
Apr 28, 08:04 AM
Step out of your little fairytale world
I loves me some irony.
Popeye206
Apr 8, 08:20 AM
It's about time. Best Buy does not deserve the time of day - their employees are low, their service stinks, and their whole philosophy is unethical. Looks like it's starting to come back to haunt them now...
They were caught here on the east coast with a separate web site that they would use when you came into the store to jack up prices. So you'd see a product on the web site for $X, go into the store and find it 10% higher, and then they would show you on the fake site that the higher price was the "right" one. A bait-and-switch routine.
I never heard any more about this and have been surprised. I would have thought that would have been their death with consumers. I know I won't buy from them if I can help it. Although I love to look there. :)
nilk
Apr 6, 04:14 PM
I run a Windows VM with 1 GB of dedicated memory and a Linux VM with 1.5 GB of dedicated memory. All while Xcode is open and doing something in every OS.
Seriously, software development is one of the least resource-hungry tasks you can do on a modern computer. Browsers use more system resources nowadays than code editors/compilers/debuggers.
Totally depends on what tools you are using. Sure, when I'm at home working on a light webapp running nothing but Emacs, Chrome, and Postgres, and using, say, Python as my server-side language, 4GB of RAM is more than enough; hell, I could get by with 2GB no problem.
But at work I have open: Eclipse, one or more instances of Tomcat or Jetty, Oracle SQL Developer (a Java app), a Windows VM with Visual Studio and other tools, and maybe a Linux VM running Oracle. I always have the Windows VM running. When I had 4GB, things would drag, and I couldn't run the Linux VM without my system becoming unusable. Now that I have 8GB, things run great; I can afford to give my Windows VM over 2GB, and I don't notice the difference between running and not running my Linux VM. Sometimes I have as many as three VMs running, using over 3GB of RAM in total, and things are still smooth unless there's a lot of hard drive access going on.
But it's encouraging to know that you're successfully using an MBA with 4GB even with VMs eating up half your RAM. Maybe the SSD makes a huge difference.
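To put rough numbers on the RAM budgeting described above, here's a minimal sketch; the 8GB total and the "over 2GB" Windows VM figure come from the post, while the other allocations are purely illustrative guesses:

# Back-of-the-envelope RAM budget for the workload described above.
HOST_RAM_GB = 8.0

allocations_gb = {
    "Windows VM (Visual Studio etc.)": 2.0,  # "over 2GB" per the post
    "Linux VM (Oracle)":               1.0,  # illustrative guess
    "Eclipse + Tomcat/Jetty":          1.5,  # illustrative guess
    "OS + everything else":            1.5,  # illustrative guess
}

committed = sum(allocations_gb.values())
print(f"Committed: {committed:.1f} GB of {HOST_RAM_GB:.1f} GB "
      f"({HOST_RAM_GB - committed:.1f} GB headroom before swapping)")

On a 4GB host the same workload leaves essentially no headroom, which matches the dragging described above once the Linux VM comes into play.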
dempson
Mar 26, 03:23 PM
NB: For the native English speakers... what is the right way to refer to a company such as Apple or Microsoft? I used "it" here, but sometimes I also use "they"... and I don't know which one is correct!
Both are acceptable. In the UK, Australia and New Zealand, the convention seems to be to refer to a company in the plural, i.e. "they". In the US, the convention seems to be to refer to a company in the singular, i.e. "it". To me (in New Zealand), "they" seems more natural because most companies involve multiple people.
Heilage
Mar 1, 06:23 AM
I have no right to condemn anyone to hell.
If heaven were very crowded, it wouldn't be very heavenly, would it?
Fair point. Then again, if one makes the assumption that Heaven is full of people with ideas like yours, I'd rather stay here or in Hell. Which is basically the same thing anyway. :p
AngryCorgi
Apr 6, 04:16 PM
Since you have no clue how the sandy bridge airs will perform, I'll take your statement as FUD.
I'll give you some insight into their potential. The desktop i7-2600K's integrated GPU has been benchmarked as roughly equivalent to a 9400M in performance (assuming a similar CPU).
i7-2600K (desktop): GPU clock = 850/1350 MHz (normal/turbo)
i5-2410M (13" MacBook Pro base): GPU clock = 650/1200 MHz (normal/turbo)
i7-2620M (13" MacBook Pro upgrade): GPU clock = 650/1300 MHz (normal/turbo)
i5-2537M (theorized 11"/13" MBA): GPU clock = 350/900 MHz (normal/turbo)
i7-2649M (theorized 13" MBA upgrade): GPU clock = 500/1100 MHz (normal/turbo)
As you can see, none of the mobile GPUs runs quite as fast as the desktop part, but the 13" 2.7GHz upgrade CPU's GPU comes fairly close. Now, the 2.13GHz MBA + 320M combo matched or beat the i7-2620M in 75% of the tests (and was only narrowly beaten in the other 25%). There is going to be some random inconsistency regardless, due to driver variances in different apps.

The issue here (and this can be shown in Core 2 vs. i5/i7 testing on the Alienware M11x) is that the Core 2 Duo very rarely gets beaten by the i5/i7 in gaming/video playback. That's because not many games are single-threaded anymore, and when 2+ threads are in use, the i5/i7 ULV won't jump its clock speed at all. Further, the 2.13GHz part was keeping up with and beating a 2.7GHz part (a 27% higher clock!) in that test, because graphics are the bottleneck, not the CPU.

Take into account that NONE of the ULV Core i options matches the 13" MBP 2.7GHz upgrade's GPU speed, and it's pretty clear that for graphics-intensive apps the older 320M would be the way to go. For most everything else, though, the i7-2649M would overtake the 2.13GHz Core 2. That includes a lot of non-accelerated video playback (high CPU overhead).
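To make that clock comparison concrete, here's a quick sketch that expresses each mobile IGP's clocks as a fraction of the desktop i7-2600K's, using the normal/turbo figures listed above (the numbers are the ones quoted in this post, not fresh benchmarks, and clock ratio is only a rough proxy since EU counts, memory bandwidth, and drivers matter just as much):

# GPU clocks in MHz (normal, turbo), as listed above.
clocks_mhz = {
    "i7-2600K (desktop)":           (850, 1350),
    "i5-2410M (13in MBP base)":     (650, 1200),
    "i7-2620M (13in MBP upgrade)":  (650, 1300),
    "i5-2537M (theorized MBA)":     (350, 900),
    "i7-2649M (theorized MBA upg)": (500, 1100),
}

desk_normal, desk_turbo = clocks_mhz["i7-2600K (desktop)"]

# Print each part's clocks relative to the desktop chip's.
for part, (normal, turbo) in clocks_mhz.items():
    print(f"{part:30s} {normal / desk_normal:4.0%} of desktop base, "
          f"{turbo / desk_turbo:4.0%} of desktop turbo")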
Something you guys need to be wary of is the 1333MHz memory question. Apple will likely choose to run it at 1066MHz to conserve battery life; higher memory clocks mean gratuitous battery drain.
I for one am happy Apple is growing with the modern tech, but I hold no illusions as to the benefits/drawbacks of either system.
Dunepilot
Aug 8, 04:03 AM
I'm glad that Leopard will be completely 64-bit (that's what they say, at least). I'm not sure why it's important to go on about the applications as if they mattered to the operating system itself. Increased integration like what was displayed would cause the antitrust machine to whip into action if it were Microsoft instead of Apple.
Time Machine is not exactly revolutionary, considering that there were a few 3rd-party products available--Rewind comes to mind--that journaled changes and allowed them to be restored. Still, it should put a stop to the various "I accidentally deleted..." threads. :)
Hopefully, the features not mentioned will include a better kernel that actually performs well. It would be nice to see operating system benchmarks that don't make me cringe when I look at the Mac OS X results.
Xcode version 3.0 looks good but they still haven't provided many details.
Yeah, my first thought was - oh yeah, that's just like Rewind. However, the poweronsoftware.com website now forwards to http://www.nowsoftware.com/, so maybe Rewind has been bought out by Apple to use as Time Machine. Anyone know any more about this?
Dune
littleman23408
Dec 2, 08:43 AM
I hate to link to IGN, but here goes:
GT5 damage explained (http://ps3.ign.com/articles/113/1137446p1.html)
Confusion seems to have stemmed from its differing implementation across the game's extensive garage, a point that Sony further clarified. "Standard models have minor deformation and scratches," said Sony, "Premium cars have greater visible level of damage, and Premium racing models have the highest level of damage."
I can't open the links due to work internet, but they should have done equal damage to all cars. Besides, every real car dents and scratches pretty easily.
X2468
Mar 31, 08:09 PM
Finally Google admits Jobs was right about fragmentation and recognises that to fight Apple it must become Apple. But it won't admit it. Prepare for lots of "closed is open and open is closed" stuff. Plus: the state of emergency justifying this closure is temporary: sort of like in Syria 50 years ago.
You know, I am truly sorry for the idealists in the open source community. They deserve better.
Were you attempting to make a point here?
janstett
Sep 15, 08:26 AM
And of course, NT started as a reimplementation of VMS for a failed Intel RISC CPU...
More pedantic details for those who are interested... :)
NT actually started as OS/2 3.0. Its lead architect was OS guru Dave Cutler, who is famous for architecting VMS for DEC, and naturally that design influenced NT. And the Intel N-10 RISC processor (which is where "NT" comes from: "N-Ten") was never intended to be a mainstream product; Dave Cutler insisted that the development team NOT use an x86 processor, so they would have no excuse to fall back on legacy code or legacy thinking. In fact, the N-10 build that was the default work environment for the team was never intended to leave the Microsoft campus. Over its life, NT has run on x86, DEC Alpha, MIPS, PowerPC, Itanium, and x64.
IBM and Microsoft worked together on OS/2 1.0 from 1985-1989. Much maligned, it did suck because it targeted the 286 rather than the 386, but it did break new ground -- preemptive multitasking and an advanced GUI (Presentation Manager). By 1989 they wanted to move on to something that would take advantage of the 386's 32-bit architecture, flat memory model, and virtual machine support. Simultaneously they started OS/2 2.0 (extending the existing 16-bit code into a 16/32-bit hybrid) and OS/2 3.0 (a ground-up, platform-independent version). When Windows 3.0 took off in 1990, Microsoft had second thoughts and eventually broke with IBM. OS/2 3.0 became Windows NT -- in the first days of the split, NT still had the OS/2 Presentation Manager APIs for its GUI. They ripped those out and created the Win32 APIs. That's also why, to this day, NT/2K/XP support OS/2 command-line applications, and there was also a little-known GUI pack that would support OS/2 1.x GUI applications.
YEMandy
Aug 26, 06:37 PM
Could this mean an iMac update is coming soon as well? I ordered a loaded iMac two weeks ago and it still hasn't shipped yet. The estimated ship date is Aug. 28th with arrival on the 5th...
Orange-DE
Jul 31, 11:04 AM
You Do Dat!
BlizzardBomb
Aug 27, 09:49 AM
Well, for one thing, Apple doesn't pay street prices. iMacs will only have two cores until Kentsfield, so I think it's fair to expect aggressive Conroe speeds in the iMac due to the two-core limitation. iMacs need to be about the same speed as Mac Pros because they only have two cores.
All chip prices are quoted in lots of 1,000. And does it matter whether it's street pricing or not? Apple still has to fork out an extra 30% for the CPU (plus logic-board redesign costs).
Hellhammer
Aug 8, 04:29 AM
I bought GT PSP and it's as if the developers actively tried to suck all the enjoyment out of the series.
I've seen several people saying that it's starting to be a car encyclopedia rather than an enjoyable racing game, and I kinda agree with that. My last experience with GT was GT2 on the PS1, I think, but I'm looking forward to this game. Hopefully it will be what I expect: a good, solid driving game. I hope they have spent time on the actual driving too, not just on the cars and the 3D stuff, etc.
Evangelion
Jul 15, 10:37 AM
1) This is all rumour and speculation...
2) At the price that OEMs charge for memory, less RAM is better. We can fill it with whatever we pick.
Let's see... If I could choose between two identical computers, one having 512MB of RAM and costing $1799, and the other having 1GB of RAM and costing $1799, I should buy the one with less RAM because then I could "pick my own RAM"?
And do I have to remind you that Woodcrests use FB-DIMM RAM, which isn't really all that widely available yet?
wpotere
Apr 27, 12:31 PM
I suspected it was a copy, I've never trusted the president, and I probably never will.
Wow... You tap dance worse than Trump does. Just say it, you NEVER liked Obama and never wanted him as president. So your comments earlier were nothing but a lie.
Unspeaked
Sep 19, 10:56 AM
Just make a box on the front page that has a picture of a MBP and let it say "the fastest just got faster" or something.
The fastest?
If that were the case, no one here would be complaining...
logandzwon
Apr 19, 02:51 PM
The First Commercial GUI
http://img62.imageshack.us/img62/5659/star1vg.gif
Xerox's Star workstation was the first commercial implementation of the graphical user interface. The Star was introduced in 1981 and was the inspiration for the Mac and all the other GUIs that followed.
http://img217.imageshack.us/img217/7892/leopardpreviewdesktop4.jpghttp://img714.imageshack.us/img714/5733/xerox8010star.gif
-The Star was not a commercial product; Xerox didn't sell them. (Well, eventually they did, but not as PCs. They were meant to be similar to what we'd call a terminal today.)
-The middle image is actually of an Apple Lisa. I think you were just showing it as a comparison, but some people might think you're saying it's a Star. It's not; it's a Lisa.
-Apple compensated Xerox for the ideas borrowed from the Star. SJ and the Mac team were already working on their GUI before any of them ever saw the Star, though. Also, the original Macintosh wasn't a copy of the Star; in fact, a lot of the staples of a modern GUI were innovated by Apple for the Macintosh.
runninmac
Aug 17, 01:01 AM
This is a very dumb question, but is Photoshop running under Rosetta in this test?
If Photoshop is, that is nuts.
Oh, please believe it is.
:eek:
j-traxx
Apr 8, 06:02 AM
Why anyone would ever choose to buy an Apple product at Best Buy over the Apple Store is beyond me. :confused:
No Apple Stores in the state of South Dakota, but we got BB.
torbjoern
Mar 1, 04:22 AM
Isn't it all hormonal mishaps in the womb? Does your God control that? If so, he is predisposing people to sin, and isn't that unfair that not all are exposed to that disposition?
AFAIK, Christians have this idea of "inherited sin". The predisposition to sin doesn't come from God, but from Adam.
notabadname
Mar 22, 03:42 PM
To store data temporarily. That is what RAM does.
I believe the question was about which app on the iPad 2 is hindered by the amount of RAM. What are you trying to do, with what app, that needs 1GB? If the RAM isn't enhancing the experience, then what is the point other than to increase cost? You could put 4GB in an iPad too, but you likely won't use it (with the current ~1/3 million apps). So what is the magic number that works seamlessly for 99% of what people use the device for?
Object-X
Sep 19, 12:31 AM
1. It's Merom. Not Memrom, Menron, Memron or even L. Ron.
You forgot Mormon.