Dark K
Jun 22, 09:07 AM
That is just ridiculous. What, the shipment will come on the 23rd at midnight? If they tell you "come Thursday", that probably means we won't get any. If RadioShack has indeed not received any info, that means selected stores will get the pre-orders plus some others.
twoodcc
Sep 19, 12:18 AM
Well, I hope this happens... and that they make more changes to the MBP.
jav6454
Feb 28, 04:48 PM
Well, I have nothing to say, except that the university has grounds for dismissal if a difference of opinion arises.
True, you can argue the gay card, but in this case, the college played the rights card to get rid of him. Was it the best choice? No.
Still, it's the college's right to decide who teaches or not; and seeing how it's a Catholic college, I'd say it was bound to happen.
cvaldes
Mar 22, 02:17 PM
Lack of Flash support is the Achilles' heel of the iPad. I hope Jobs gets off his high horse and relents.
Every day that Flash doesn't live on smartphones and tablets (all manufacturers, not just Apple), more content moves from Flash to HTML5. The relevance of Flash decreases a little bit every single day.
I've been an iPod touch owner since 2007 and I've adapted quite well. I also have an iPad and the Skyfire web browser will do Flash movie conversion.
Lack of Flash on portable devices = not a big deal to Joe Consumer
BBC B 32k
Jul 27, 10:13 AM
I am just waiting to pull the trigger and get myself a 20"er. Hurry up with those chips Mr Jobs. Ah and where has the wireless mouse/kbd option gone in the store? Maybe they will be free with the upgraded iMacs. :p
What a world away from the G5 iMacs these beasts will be. Still when (not if) they are out I will prob. start waiting for the chinless 23" wonder - my ideal requirement.
Must hold out...
MattSepeta
Apr 27, 02:13 PM
1. You opened it in Illustrator, not InDesign.
2. After I opened it in Illustrator like you did, it did reveal some interesting things. It seems that fields #20 and #22 are on individual layers.
http://img163.imageshack.us/img163/6643/picture1hz.png (http://img163.imageshack.us/i/picture1hz.png/)
Uploaded with ImageShack.us (http://imageshack.us)
I am fairly confident that rather than pointing to a conspiracy, this simply shows that when the document was scanned, the operator had enabled some sort of "auto-text" option that attempted to read, convert, and embed the raw text in the PDF, so as to make the text selectable in preview programs.
It only worked on certain text, as is par for the course with OCR.
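For anyone curious, the "auto-text" (OCR) layer described above is just extra text-drawing operators embedded in the PDF alongside the page's scanned image. As a rough illustration, here is a naive, hypothetical heuristic that scans raw PDF bytes for the `Tj`/`TJ` text-showing operators. It's only a sketch: real-world PDFs usually Flate-compress their content streams, which would hide these operators from a plain byte scan.

```python
def has_text_layer(pdf_bytes: bytes) -> bool:
    """Naive check: does this PDF appear to draw any text?

    A scan-only PDF typically stores each page as an image XObject
    (/Subtype /Image) with no text operators. An OCR ("auto-text")
    pass adds an invisible text layer drawn with Tj/TJ operators,
    which is what makes the text selectable in preview programs.
    Caveat: this only works on uncompressed content streams.
    """
    return b" Tj" in pdf_bytes or b" TJ" in pdf_bytes


# Toy content-stream fragments for illustration (not full, valid PDFs):
ocr_page = b"BT /F1 12 Tf 72 700 Td (scanned words) Tj ET"
scan_only_page = b"/Subtype /Image /Width 2550 /Height 3300"
```

That also explains why only some fields came out selectable: whatever the OCR engine failed to recognize simply got no text layer at all.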
John.B
Apr 6, 10:31 AM
Now just add that Thunderbolt port to the MBAs and I'll be first in line! :D
epitaphic
Aug 18, 11:46 PM
So you think they put an extra processor in across the line just to be able to say they had a quad? Even the AnandTech article you used as a source showed here (http://www.anandtech.com/mac/showdoc.aspx?i=2816&p=18) that PS took advantage of quad cores in Rosetta.
Yes, under some specific results the quad was a bit faster than the dual, though with the combo of Rosetta + Photoshop it's unclear what is causing the difference. However, if you compare the vast majority of the benchmarks, there's negligible difference.
Concerning Photoshop specifically, as can be experienced on a quad G5, the performance increase is 15-20%. A future jump to 8 cores would theoretically be around the 8% mark. Photoshop (CS2) simply cannot scale adequately beyond 2 cores; maybe that'll change in Spring 2007. Fingers crossed it does.
Your points about latency and FSB are not separate negatives as you have made them. They are redundant theoretical concerns with implications of unclear practical significance.
I beg to differ. If an app or game is memory intensive, faster memory access does matter. Barefeats (http://barefeats.com/quad09.html) has some benchmarks on dual channel vs quad channel on the Mac Pro. I'd personally like to see that benchmark with a Conroe system added. If going from dual to quad channel gave a 16-25% improvement, imagine what a 75% increase in actual bandwidth will do. Besides, I was merely addressing your statements that Woodcrest is faster because of its higher-speed FSB and higher memory bus bandwidth.
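To make the bandwidth argument concrete, here is a crude, hypothetical microbenchmark that times a bulk memory copy in pure Python. The buffer size and repetition count are arbitrary, and a Python-level copy badly understates what tuned native code achieves, so treat the numbers as illustrative only, not as a stand-in for Barefeats-style testing.

```python
import time


def copy_bandwidth_gbs(size_mb: int = 256, reps: int = 5) -> float:
    """Rough memory-copy throughput in GB/s.

    Times the fastest of `reps` full copies of a `size_mb` buffer.
    Counts only the bytes read (one pass over the source), so the
    figure is a lower bound on the actual bus traffic.
    """
    src = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(reps):
        start = time.perf_counter()
        dst = bytes(src)  # one full read+write pass over the buffer
        best = min(best, time.perf_counter() - start)
    assert len(dst) == len(src)
    return (size_mb / 1024.0) / best
```

On a memory-starved system this number drops sharply once the buffer exceeds the caches, which is exactly where extra channels (or a faster FSB) start to pay off.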
I am not worried. Everything anyone has come up with on this issue are taken from that same AnandTech article. Until I see more real-world testing, I will not be convinced. Also, I expect that more pro apps such as PS will be able to utilize quad cores in the near future, if they aren't already doing so. Finally, even if Conroe is faster, Woodcrest is fast enough for me ;).
AnandTech, at the moment, is the only place with a quad-Xeon vs dual-Xeon benchmark. And yes, dual Woodcrest is fast enough, but is it cost effective compared to a single Woodcrest/Conroe? It seems that for the most part, Mac Pro users are paying for an extra chip but only really utilizing it when running several CPU-intensive apps at the same time.
I think you misread that. They were comparing Core 2 Extreme (not Woodcrest) and Conroe to see whether the increased FSB of the former would make much difference.
You're absolutely right about that; it's only measuring the improvement from the increased FSB. If you take into account FB-DIMMs' appalling efficiency, there should be no increase at all (if not a decrease) for memory-intensive apps.
One question I'd like to put out there: if Apple has had a quad-core Mac shipping for the past 8 months, why would it wait until the Intel quads to optimize the code for FCP? Surely they must have known for some time beforehand that they would release a quad-core G5, so either optimizing FCP for quads is a real bastard or they've been sitting on it for no reason.
wizard
Mar 26, 10:35 AM
It is pretty incredible that the ignorance around Mac OS releases never stops. For one thing, if you lose data on a computer, the only person to blame is the one staring at you in the mirror.
Even the whine about nothing worthwhile for the user is a bit old and reflects what we heard about SL. Yet SL on my early 2008 MBP was a drastic improvement for the user right out of the box and just got better with each update. User-facing features are not the only reason to update; fixes to underlying facilities can go a long way to justifying a software update.
As to the server integration, it has never been and never will be a product worth $500. It is great that Apple is adding support to the base install, but people need to realize a few things. One is that Mac OS is UNIX; people need to get that through their heads. Thus Apple's server product only really adds what is already seen in many UNIX installations in a base install. Speaking of which, much of that functionality is well-established open source. Second, the pricing of "server" software seems to be tailored to fit the mentality of the corporate world, where they feel they need to pay big bucks for something trivial. It is no wonder that Linux has established itself as a server OS in the SOHO world and at some of the more forward-thinking larger corporations. As others have pointed out, the basics of UNIX have been around for ages now; very little new territory is being cleared here, thus little justification for upcharges on server software.
Finally, it is a bit cowardly to avoid the future because you see nothing of value there for you personally. It is frighteningly similar to the attitude seen in those that cut their own wrists.
M-O
Apr 6, 07:01 PM
Apple should forget Intel and put a quad-core A6 chip in the MacBook Air. Re-architect Mac OS to run on ARM (OS Xi) and rule the world.
It may sound crazy now, but you'll see. If anyone knows how to change architectures, it's Apple. We all know they've got OS X running on an iPad already in the labs.
daneoni
Aug 25, 03:52 PM
Another person who can never be satisfied.:rolleyes:
What is that even supposed to mean?
aafuss1
Aug 6, 05:26 PM
I think they'll go UDI instead of HDMI (and save fees). The really interesting question here though is HDCP and what means for all existing hardware including cinema displays...
HDMI is very common, as many brands have it now. Some PCs also use it. UDI is better, but not a lot of devices may have it until 2007.
SgtPepper12
Apr 27, 08:13 AM
Oh my god I knew it! Apple collects the data and does evil things with it! I can't imagine what kind of evil things they are going to do with it!
No, seriously, I really don't. Printing out huge posters with a map of your latest locations saying "LOOK AT WHERE THIS GUY WAS. HE WAS AT THE SUPERMARKET LATELY. HE SURELY BOUGHT SOME THINGS THERE, LIKE TOMATOES. YEAH, THAT KIND OF THING." maybe.
Strange people.
Mattsasa
Apr 6, 03:01 PM
I hope that number keeps rising; we need competition so Apple doesn't rest on its laurels.
Apple isn't resting on its laurels.
If that number rises... which it will, it just means fewer developers and apps for iOS.
craznar
Apr 27, 08:57 AM
I know of no cell tower or wifi device that works up to 100 miles away.
Then you know little... :)
Some of the cells in western Queensland are up to 200km across.
rosalindavenue
Mar 31, 03:18 PM
Not a problem for me. HTC does a great job keeping phones updated.
Spoken like someone who never owned an Eris.
Stridder44
Apr 10, 12:28 AM
This should be interesting.
manu chao
Aug 27, 05:31 AM
You're screwing up, Intel. We don't want 300 trillion transistors on a 1 nm die. We want longer battery life. Idiots.
Don't blame Intel; blame Apple for not using the ULV versions of the Core Duo chips. There are other manufacturers that use them (otherwise it would not make much sense for Intel to offer them).
However, the battery life of these machines is maybe on the order of six hours only, partly because the screen, HD, etc. still need the same amount of power. By making the screen smaller, using Intel graphics, and maybe even a 1.8" HD, you can reduce power consumption further, but manufacturers most often also reduce battery size at the same time to make the laptops lightweight, preventing you from seeing battery life numbers of ten hours.
Moreover, reports about machines using the ULV versions (and sometimes 1.8" HDs) do complain about the performance.
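The arithmetic behind a figure like that six-hour estimate is simple: runtime is battery capacity in watt-hours divided by total draw in watts. A small sketch with made-up component numbers (the wattages and the 60 Wh pack are illustrative assumptions, not measurements of any real machine):

```python
def battery_life_hours(capacity_wh: float, draw_w: dict) -> float:
    """Runtime estimate: watt-hours of capacity / total watts drawn."""
    return capacity_wh / sum(draw_w.values())


# Hypothetical laptop power budget (illustrative numbers only):
draw = {"cpu": 5.0, "screen": 4.0, "hd_and_rest": 1.0}  # watts
hours = battery_life_hours(60.0, draw)  # 60 Wh / 10 W -> 6.0 hours
```

This also shows why shaving a watt or two off the CPU alone only helps so much: the screen and drive keep drawing regardless, and shrinking the battery to save weight cancels the gain.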
toddybody
Mar 26, 04:33 AM
So it's like, the complete version, er... not quite done yet, but nearly finished...
:confused:
epitaphic
Sep 13, 12:47 PM
Anyone seen this?
http://images.dailytech.com/nimage/1775_large_longtermroadmap.png
The real architecture changes are coming June then June then June 2012. With derivatives in the years between.
So Merom (Merom Santa Rosa)/Conroe/Woodcrest (Clovertown) are the end of the road for separate chips. No more mobile/desktop/server chips... all are the same (expect mobiles to have the lowest MHz, then desktop, topping out with server).
And what's interesting is that each architecture change will be a leap in performance similar to Pentium D to Conroe transition. (source) (http://www.dailytech.com/article.aspx?newsid=2649)
Screw Tigerton, Penryn's next (probably June 2007)
chatin
Aug 22, 09:08 PM
The Woodcrest processors have been put through their paces pretty well on the supercomputing lists, and their Achilles' heel is the memory subsystem. Current-generation AMD Opterons still clearly outscale Woodcrest in real-world memory bandwidth with only two cores. Unless Intel pulls a rabbit out of its hat with its memory architecture issues when the quad core is released, AMD's quad core is going to embarrass them because of the memory bottleneck. And AMD is already starting to work on upgrading its already markedly superior memory architecture.
This is one of the drawbacks of using a server CPU on the desktop. In a lights-off Xserve this would not matter, as most of the data is already cached in memory.
I think there might be lights out for future Mac Pro Xeons if AMD were to catch up in the race.
:rolleyes:
CFreymarc
Apr 6, 04:18 PM
Oh yeah, well just wait until people find out iOS is a closed system and the Xoom uses Android which is open....
oh nevermind :D
"Hey babe, I just relinked the kernel of my tablet." is a line that really doesn't work.
"My girl, pet this." (iFur app runs on iPad) Yup iPads get you laid.
Enigmac
Aug 7, 03:24 PM
Remember guys, these are only a few of the MANY features that Leopard will have to offer... including the top secret one. Steve made that clear.
Zadillo
Aug 27, 06:04 AM
Damn PowerPC fans.
Apple is INTEL now. We Love Intel Because Stevie Tells Us So.
We hate AMD and IBM. Should Apple ever move to another CPU provider, we will seamlessly transition to hating Intel again. This is the Way of the Mac.
What's so good about G5s anyway? They are slow, too hot, and skull juice.
Why do we love Intel? Because Steve says to, and Core 2 Duo is powerful, cool, not permanently drunk, allows us to run Windows and helps Apple increase its market share.
We love ATi because just like Intel, their products are the best at the moment. We still love nVIDIA because their GPUs are in the Mac Pro.
We love Israel because they make our Core 2 Duos and we love China because they make our Macs. We love California because that's where Our Lord Stevie J is (Don't particularly care about the rest of the US, sorry guys).
We love our Big Cats because they run so fast and look so clean and powerful (Hmmm... Mystery of OS codenames revealed?) and of course because they are not Windows, which are susceptible to breaking...
People who live in Windows shouldn't throw Viruses?
Off track...
Anyway, Rawr to all you PowerPC fanboys (And girls)
Intel 4EVER!
I know this is just a joke, but even so it's stupid, because the implication is that the only reason anyone here might like the chips Intel is coming out with is because they have been brainwashed into liking them now that Apple uses them (i.e. if Apple was still using PowerPC chips, or had switched to AMD, we would all be sitting here talking about how crappy the Core 2 Duo chips are).
I'm sure there are some people like that, but it is insulting to plenty of people here who actually do know something about the various chips that Intel and AMD make and base their opinions on their actual merits and weaknesses.
-Zadillo