chrmjenkins
Apr 11, 03:26 PM
Does Arn write every single article on this forum?
No, my guess is Eric Slivka is on vacation or something. He writes the majority of MR articles unless arn specifically wanted to take that over.
kalun
Sep 18, 11:27 PM
As I is naught en Amerikan canned sumone plz tell mi wen tanksgifting is? :p
lol, 1337 sp3ak FTW!!
theBB
Aug 11, 07:28 PM
Confused.
Can somebody explain to me the differences between the cellphone markets in the US and Europe?
Will an 'iPhone' just be marketed in the US, or worldwide (as the iPod is)?
Well, let's see: about 20 years ago, a lot of countries in Europe, Asia and elsewhere decided on a standard digital cell phone system and called it GSM. About 15 years ago GSM networks became quite widespread across these countries. In the meantime the US kept on using analog cell phones. Motorola did not even believe that digital cell phones had much of a future, so it decided to stay away from that market, a decision which almost bankrupted the company.
The US started rolling out digital service only about 10 years ago. As the US government does not like to dictate to private companies how to conduct their business, it sold the spectrum and laid down some basic ground rules, but for the most part it let the service providers use any network technology they wished. For one reason or another, these providers decided to go with about four different standards at first. Quite a few companies went with GSM, AT&T picked a similar but incompatible TDMA standard (IS-136), Nextel went with a proprietary standard called iDEN, and Sprint and Verizon went with CDMA, a radically different standard (IS-95) designed by Qualcomm. At the time the other big companies were very skeptical, so Qualcomm had to not only develop the underlying communication standards but also manufacture the cell phones and the electronics for the cell towers. However, once the system proved itself, everybody started moving in that direction. Even the upcoming 3G system for GSM networks, called UMTS, uses a variant of CDMA technology.
CDMA is a more complicated standard compared to GSM, but it allows the providers to cram more users into each cell, it is supposedly cheaper to maintain, and it is more flexible in some respects. However, anybody in that boat has to pay hefty royalties to Qualcomm, which dampens its popularity. While creating UMTS, the GSM standards bodies did everything they could to avoid using Qualcomm patents and thereby avoid these payments. However, I don't know how successful they were in those efforts.
Even though Europeans on these forums like to gloat that the US did not join the worldwide standard, that we did not play along, and that ours is a hodgepodge of incompatible systems, without the freedom to try out different standards CDMA would not have had the opportunity to prove its feasibility and performance. In the end, the rest of the world is also reaping the benefits through UMTS/WCDMA.
Of course, not using the same standards as everybody else has its price. The components of CDMA cell phones cost more and the system itself is more complicated, so CDMA versions of cell phones hit the market six months to a year after their GSM counterparts, if at all. The infrastructure cost of a rarer system is higher as well, so AT&T had to rip apart its network and replace it with a GSM version about five years after rolling it out. Sprint is probably going to convert Nextel's system in the near future as well.
I hope this answers your question.
zacman
Apr 6, 04:05 PM
Yeah, good luck to Android tablets without carrier BOGO deals, Apple carrier exclusivity, and greater retail distribution than Apple. None of these factors apply in the tablet market.
That's why Apple lost around 30% market share in less than two months when the Galaxy Tab was released? You know, the tablet that runs an outdated phone OS and not even a tablet OS...
fenderbass146
Apr 8, 12:56 AM
I agree, this rumor is sketchy. It looks like they have one unreliable source. Still, I don't see why BB is good for Apple stuff unless the Apple store is too crowded.
I agree; if I am shopping for Apple stuff I would prefer an Apple Store. However, there is a Best Buy everywhere. I live in northwest Indiana, the nearest Apple Store is 40 minutes away, and I'm sure a lot of people have it worse. It would be absolutely idiotic of Apple to quit supplying Best Buy, because Best Buy has a longer reach to more people than Apple does.
faustfire
Sep 13, 12:54 PM
A bit pointless given that no software utilises the extra cores yet. But nice to know, I guess.
A lot of 3D programs will use as many cores as are available when rendering.
And I would say that the next versions of many programs will be better suited to multi-core processors. They are way too common for software developers to ignore any longer.
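(To make that concrete, here's a minimal sketch, not taken from any particular 3D package, of how a tile-based renderer can fan work out across every available core with Grand Central Dispatch; the renderTile function and the tile size are made up for the example.)
```swift
import Foundation

// Stand-in for a renderer's per-tile work; each tile is independent,
// which is what lets 3D renderers scale with core count.
func renderTile(_ index: Int) -> Int {
    var checksum = 0
    for p in 0..<(64 * 64) {                     // pretend each tile is 64x64 pixels
        checksum += (p &* (index + 1)) % 251     // dummy "shading" math
    }
    return checksum
}

let tileCount = 256
let lock = NSLock()
var total = 0

// concurrentPerform fans the iterations out across all available cores.
DispatchQueue.concurrentPerform(iterations: tileCount) { i in
    let value = renderTile(i)
    lock.lock(); total += value; lock.unlock()
}

print("Rendered \(tileCount) tiles on up to \(ProcessInfo.processInfo.activeProcessorCount) cores, checksum \(total)")
```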
tortoise
Aug 7, 06:14 PM
Why not just improve the Backup program that comes with .Mac or include it for free? Do we really need another interface? To me it looks like form over function.
You are out of your mind. A true versioning file system is insanely useful, and it has been a Holy Grail file-system feature that has not existed largely because it needs a significant amount of spare disk space and disk performance; it is not a cheap feature to implement. Once you have it and applications start to use its functionality, it will be like the internet: you will wonder how you ever got along in the computer world without it.
I do not care how they presented it; if it works as advertised, then it is a "killer app" that will cause many people to part with their hard-earned money (myself included).
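(Purely to illustrate the disk-space trade-off, here's a minimal sketch, not Apple's actual implementation, of a snapshot-style backup in which unchanged files are hard-linked from the previous snapshot and only changed files are copied, so each new version only costs space for what actually changed. The snapshot function and the example paths are hypothetical.)
```swift
import Foundation

// Minimal snapshot sketch: hard-link unchanged files from the previous
// snapshot and copy only files that changed, so old versions stay browsable
// without duplicating the whole disk each time. Handles a flat folder only.
func snapshot(source: URL, previous: URL?, destination: URL) throws {
    let fm = FileManager.default
    try fm.createDirectory(at: destination, withIntermediateDirectories: true)

    for name in try fm.contentsOfDirectory(atPath: source.path) {
        let src = source.appendingPathComponent(name)
        let dst = destination.appendingPathComponent(name)
        let old = previous?.appendingPathComponent(name)

        if let old = old,
           fm.fileExists(atPath: old.path),
           fm.contentsEqual(atPath: src.path, andPath: old.path) {
            // Unchanged since the last snapshot: a hard link costs almost no space.
            try fm.linkItem(at: old, to: dst)
        } else {
            // New or modified: this is where the extra disk space goes.
            try fm.copyItem(at: src, to: dst)
        }
    }
}

// Example usage (paths are hypothetical):
// try snapshot(source: URL(fileURLWithPath: "/Users/me/Documents"),
//              previous: URL(fileURLWithPath: "/Backups/2006-08-06"),
//              destination: URL(fileURLWithPath: "/Backups/2006-08-07"))
```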
appleguy123
Feb 28, 07:18 PM
Do you realize how incredibly rare paedophilia is? Also the Media is stupid and uses the wrong words intentionally. Truth, outright slanderous lies, what's the difference if it sells copies eh?
I wasn't around in the 1970s, but I'm pretty sure that pedophilia wasn't normal then.
Some of this may be media frenzy, but if even one child rapist is hidden by the Catholic Church, it doesn't reflect well on them.
"In the 1970s, pedophilia was theorized as something fully in conformity with man and even with children," the pope said. "It was maintained - even within the realm of Catholic theology - that there is no such thing as evil in itself or good in itself. There is only a 'better than' and a 'worse than.' Nothing is good or bad in itself."
gnasher729
Jul 27, 05:37 PM
This is a positively thoughtless remark. No one's cheering the MHz myth on; in fact, Intel itself has abandoned the concept. Until the 3 GHz Woodies get dropped in a Mac Pro, the 2.7 GHz G5 will still be the fastest chip ever put in a Macintosh.
Assuming that you are talking about clock speed, there have been Macs running at over 3 GHz, just not for sale to the public. The Intel machines that were shipped to developers after WWDC 2005 had 3.4 GHz Pentium IVs.
rezenclowd3
Aug 5, 01:06 PM
I think I will only enjoy the game if I buy a very expensive racing wheel. I already have the seat setup... a racing game using the standard controller is just odd.
Pre-ordered the US Collector's Edition, but now I would like to change that for the UK edition... I am ready to drop my pre-order should the multiplayer reviews prove lacking, and hopefully they give darn good penalties for hack driving online (DQ, drive-through pit lane, etc.). Rubbing IS NOT racing! That's for drivers who do not have control of their car.
Iconoclysm
Apr 19, 08:28 PM
Apple may have expanded upon existing GUI elements, but it didn't invent the GUI. Very big difference there.
Interesting that you now notice the difference between the two when you started the entire discussion with your complete misunderstanding of someone already differentiating between the two...
Liske
Aug 17, 02:42 PM
I have a new 3.0 Intel; just letting you know they are not as close as Rob's tests suggest under real-world performance. Adobe Camera Raw really screamed on my G5 and is noticeably slower and a bit buggy on my new Mac Pro. Startup is a lot slower, etc. He only tested MP-aware processes, which isn't the whole picture.
The Photo Retouch Artist test puts the Mac Pro 3.0 about 33% slower than the quad G5, but I think that test is skewed to the G5's liking. I think it's somewhere in the real-world realm of 12% slower than my G5 quad. Not quite as good under Rosetta as the [5%?] Rob posts, but not quite as bad as some other testers' results. The Finder and other apps are noticeably faster, even against the fast quad.
I went for the Mac Pro as a web designer able to run windoze now. CS2 gets some but not a lot of exercise. Other comparisons: the storage is awesome, super easy, super quiet. This machine is about 75% the noise of my G5; add the quiet FirmTek two-drive SATA enclosure I ran with the quad, and the Mac Pro is about 50% quieter. [By the way, if anyone needs a two-drive FirmTek external SATA II case with PCIe card and cables, it is looking for a new home now. It was a great case for the G5 and is about 6 months old: http://www.amug.org/amug-web/html/amug/reviews/articles/firmtek/2en2/]
My 2 cents!
Mac Pro 3.0
3 GB RAM
2 x two-drive striped RAIDs
Standard graphics card.
Stridder44
Apr 8, 01:12 AM
To be fair - Apple themselves were doing the same thing - in the UK at least.
On a number of occasions I found that Apple Stores actually had stock in store available for reservation, but were forcing an entirely unnecessary half-hour 'unboxing and setup' appointment.
That just defies all reason. I mean it's not like they need to create more demand for these things.
afrowq
Apr 6, 08:50 PM
If your sector of the business has decided to move to Premier because it works for them, awesome- but don't paint it as an industry trent. Cause I've seen zero migration from FCP to PP in Toronto post houses. Pro editing is still a two horse race: AVID and FCP.
And I can't help but think how ironic it will be if the new FCS is built on AV Foundation, which was pioneered on your hated "itoys".
http://www.philiphodgetts.com/2011/02/a-new-64-bit-final-cut-pro/
Never said it was an industry-wide trent (sic). I said "a lot of professionals" have made the switch.
Thanks.
bibbz
Jun 14, 07:21 PM
I called all 3 corporate stores in my county and none of the managers knew about the conference call and none of them know how they're going to handle pre-orders tomorrow morning. They ASSUME it will be the way they pre-sold Evo phones which was with a $50 deposit. But they don't know anything about PIN numbers or anything else the East Texas administration or national is telling you. One manager reported there's a pre-opening conference call scheduled for California stores tomorrow morning. The other 2 didn't even report that to me. But just called my closest store and the manager says the 8:30 conference call tomorrow morning is a weekly event and nothing special for the pre-order instructions. :confused:
My district held its 8:30 conference call for tomorrow at 5 pm today. There was the normal RS stuff, then all about the iPhone launch. Some stores may not get the info until tomorrow because it was communicated so late in the afternoon.
I just got off the phone with my local RadioShack. I was told that the PIN would not guarantee you a phone on launch day, but that the chances of getting one are VERY VERY good. I know the manager very well, and trust that whoever told him said the same thing. I'm assuming the calls were done on a regional, if not district, level, as opposed to company-wide, to give people a chance to ask questions, so it seems most likely that personal interpretations came into play, causing the original message, whatever it may have been, to get screwed up.
The original call was area-specific and all DMs were on it. Then my DM held a call with us immediately after the big call. Some DMs might hold this info until tomorrow. I have no idea why.
Why on earth would Radio Shack ask anyone to stand in line tomorrow to get a PIN just to stand in line again on opening day to get a phone you are not guaranteed?
My point exactly. We wouldn't be doing it if you weren't guaranteed a phone. See my text above, and call your store again in the AM.
maclaptop
Apr 14, 04:48 PM
Still, you cannot say the iPhone is the best smartphone on the market, just as someone else can't say the Atrix is the best. Different strokes for different folks!
+1
braddouglass
Apr 6, 12:56 PM
A hard drive uses less than 2 watts while reading or writing. Flash uses the same or more when it is being used; it only has an advantage when it is not being used, when a hard disk drive still has to spend energy (a bit less than 1 watt) to keep the platters spinning.
So I suppose that standby temps would be low and that operating temps would be about the same as any other laptop. Sounds good to me, haha.
All I want is a faster processor and a backlit keyboard and I'll be happy with it.
Already, with the flash storage and 4 GB of RAM, it should be wicked fast, but I'd like an i5 at least...
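(A rough back-of-the-envelope take on the quoted power figures; the 10% active / 90% idle split is an assumed duty cycle, just to show that flash's advantage comes almost entirely from idle time.)
```swift
import Foundation

// Quoted figures: roughly 2 W active for both, roughly 1 W for a spinning-but-idle
// hard disk, near 0 W for idle flash. The duty cycle below is made up for the example.
let activeFraction = 0.10                 // share of an hour spent reading/writing
let idleFraction = 1.0 - activeFraction

let hddWattHours   = 2.0 * activeFraction + 1.0 * idleFraction   // about 1.1 Wh
let flashWattHours = 2.0 * activeFraction + 0.0 * idleFraction   // about 0.2 Wh

print(String(format: "Per mostly-idle hour: HDD = %.2f Wh, flash = %.2f Wh", hddWattHours, flashWattHours))
```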
Trekkie
Sep 13, 05:42 PM
According to the AnandTech article, it's likely that the Clovertown family will be clocked slower than the Woodcrests.
Clock speed isn't everything. Workload dependent, of course.
shamino
Jul 14, 04:17 PM
According to Appleinsider, the Mac Pro would have 2 4x and 1 8x PCIe slots. I see two problems with this. (1) All higher-end PC mobos out now have at least 1 16x slot, some have 2 for SLI/Crossfire.
Re-read the article.
It says there will be three available slots - 2 4x and 1 8x. These are the slots that will not be used by factory-bundled devices.
The bundled ATI X1800/X1900 video card will be in a 16x slot. It probably won't physically fit anywhere else!
(2) Why only 3 slots? PCs have 6 or so (as did the Power Mac 9500 & 9600) with a few regular PCI slots.
4 slots. 3 unused. Not 3 total.
Most PCs don't have more slots, either. Sure you can find a few counter-examples, but 6-slot systems are not common. And with the exception of the PM 9500/9600, Apple has never shipped a 6-slot system. (The Quadra 950 had 5. Everything else shipped with 4 or less.)
Why would Apple shoot itself in the foot like this? The Mac Pro is supposed to be a lot better than all other PCs. It would be nice to have 2 16x lanes for SLI and a few PCI slots for older expansion cards and cards that don't need the bandwidth of PCIe. Besides, this is supposed to be a Pro Mac, which means professional people would want to add a bunch of cards, not just 3. I'd expect a person working in something like movie production would want to have dual graphics cards, a fiber channel card to connect to an xServe RAID and maybe an M-Audio sound card for audio input. Since I don't work in movie production, I wouldn't know, but it would make sense.
You seem to think that a Pro system must have the capability of accepting every hardware device ever invented. (And how do you do this without making the case six feet tall?)
Dual video cards are only used by gamers. I doubt gamers are going to be interested in buying one of these, for the same reason they don't buy other Macs - the software comes out for other platforms first.
As for FC interfaces, they can work fine in any of the available slots. And there's no need for audio cards when you've got S/PDIF optical audio in/out.
Remember also that a studio won't be doing both video and audio editing on the same console! The people who are expert at one job are not going to be expert at the other. And if your studio is so strapped for cash that the different editors have to share a single computer, then you're in pretty sad shape!
I don't think you realize what you're asking for. A system that is capable of performing all possible tasks at once is just unrealistic. Nobody will ever equip a system like that, because no user will have those kinds of requirements.
Even in the PC world, where more slots are common, you almost never find a system that has actually filled all those slots with devices.
geerlingguy
Aug 16, 11:29 PM
That's great that Adobe apps run well under Rosetta on the new Mac Pro.
It makes it very tempting to buy one.
My only concern is with the Rev. A of any hardware.
I'll wait and buy the next version of the Mac Pro. I think by then, even under Rosetta, Adobe apps will fly in comparison to the quad G5. Can't wait for the universal apps, though.
Always a judicious choice. I know that my Dad had about 6 months of little gripes with his DP G5 (1st generation) because of fan and 'buzzing' problems. He was kind of a 'beta tester' of the new hardware until a firmware update fixed his main problems.
Plus, if the 1st generation turns out to be reliable, you could get a used 1st gen. machine for a nice deal once the 2nd gen. machines are released!
logandzwon
Apr 25, 02:49 PM
OMG, have you heard? Apple is secretly spying on our TXT messages, contacts, and e-mail! Seriously! If someone stole my iPhone and guessed my passcode they would be able to look through this list of "contacts" and find out where I lived. They could even check my calendar and know when I'm not home and come rob me. Even if I put fake info in my contacts they can still see my e-mails! All they have to do is look through my e-mails and find a receipt, or shipping notice, or the bill for the power, or my cellphone bill, or my water bill and they would have my home address!
marksman
Mar 22, 11:29 PM
Someone give Android's UI and Playbook's UI huge recognition so Apple will change it's old grid-like UI.
I am not sure you are using "UI" correctly.
The iPad 2 does have some shortcomings, few of which are worth going into here. However, the OS of these devices IS crucial and we are beginning to see iOS creaking slightly. In terms of looks and notifications,
I get the notification thing, but I keep seeing some people talking about the look of the iOS interface being dated, and I don't get it. It seems like a very young and inexperienced viewpoint: wanting change solely for the sake of change. The UI for iOS works very well. I don't want it changed just because some people are bored of looking at it. This is something you realize as you get older and more experienced in life: change just for the sake of change is not a great deal, most of the time.
Change for the sake of improved usability and function? I am all for it. Change of the UI just because they have used the same basic look for five years? No, not really.
I can assure you that doubling the 256 MB of the first iPad is not enough for people that need a lot of multitasking, like me.
Here I don't think you understand how "multitasking" works on iOS devices.
It is not really possible to do a "lot" of multitasking. There are only a certain number of APIs that can be used concurrently in the background. Having a bunch of apps listed in the fast task switcher is not multitasking, and it does not require more RAM.
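(For the curious, here's a minimal sketch, in modern Swift, of the real UIKit background-task API behind this: an app leaving the foreground gets a short, finite window to finish up, after which the expiration handler fires and the app is suspended. The task name and the pretend work are made up.)
```swift
import UIKit

// Sketch of the finite background window iOS grants an app that leaves the
// foreground. There is no free-running background process to feed RAM to;
// the system hands out a few minutes at most, then suspends the app.
func finishPendingWork(app: UIApplication = .shared) {
    var taskID = UIBackgroundTaskIdentifier.invalid

    func endTask() {
        guard taskID != .invalid else { return }
        app.endBackgroundTask(taskID)
        taskID = .invalid
    }

    taskID = app.beginBackgroundTask {
        // Time is up: clean up before the system suspends the app.
        endTask()
    }

    DispatchQueue.global().async {
        Thread.sleep(forTimeInterval: 2)   // hypothetical leftover work, e.g. flushing an upload
        endTask()
    }
}
```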
Android phones are selling more than iPhone.
iPhone has started a market, competitors are improving it.
iPad has started a market, competitors are improving it.
The problem is that Android becomes the brand and all these hardware makers become a commodity. People who have an Android phone look to get a new Android phone. They don't look to get an upgrade to their current phone, because no upgrade exists; the hardware makers just come up with new dumb names for products six times a year.
On the other hand, someone with an iPhone is going to upgrade to another iPhone, and so on. The brand and name build on themselves. This only becomes a bigger advantage for Apple as time goes on... And as others have noted, it is silly to compare the user base of a free OS that is installed on hundreds of different hardware products with that of the market leader, the iPhone, which has a massive market-share advantage over the next biggest competing handset.
People who own a Motorola Suxit V or an HTC Yourmomma have NOTHING in common other than that they both might be running some (probably different) variation of the Android OS. Compare that to two people owning iPhones, even different model iPhones, where the experience will be very similar and comparable.
If you just can't recognize how multitask works better with 1GB RAM and true background apps (QNX, Honeycomb), then you deserve to use a limited thing like an iPad.
If you don't like your battery life, you got a point. Perhaps you can just always have a long extension cord and then you got a real winner!
I've only bought the first iPad because there were no competitors at that time (and I hate netbooks), but now things are different. To be honest, A LOT different.
At this point and time there are still no real competitors. There is one copycat device out there that is inferior, and a couple more potentially coming out soon... but nothing is guaranteed.
People said that the iPhone was going to be the best phone out there, but the market is showing something different.
People say the iPad is the best tablet out there, but it seems that the market is going to show something different.
I think the market clearly shows the iPhone is the best phone out there. There is no other phone that comes anywhere close to selling as much as the iPhone. For the iPad it's even more lopsided, and it will pretty much stay that way, as all of the competitors are just clones of the iPad, and they don't have the advantage of a protected Verizon environment to move their product. They will have to compete against the iPad 2 for every sale they make.
With the shortages of iPad2's out there, and international sales about to start up, probably making it worse, if the Xoom, G Tabs and Playbooks are "close enough" (particularly for folks that are not avid Apple followers), they could get quite a few sales. At least that is my opinion. (And like everyone I have an @$$-hole too.):)
This is a good point. The supply chain deficit is really the only chance these clone machines have of making inroads. I suspect the supply issue will be resolved before anyone else gets to market though, so the only one who will benefit from it is the Xoom.
xxBURT0Nxx
Apr 6, 10:31 AM
I have a 13" Ultimate of the current generation. The limiting factor for me is the graphics, not the processor, so going to Sandy Bridge with the Intel HD 3000 would make a less appealing machine for my uses than the current model. It's really too bad the Sandy Bridge Macs are tied to those garbage integrated graphics.
Only the 13" MBP is limited to integrated graphics; the HD 3000 is not quite as good as the 320M in older models or the current MBA, but it is much better than integrated graphics of the past. All other MBP models come with the integrated graphics as well as a discrete graphics processor.
cecildk9999
Nov 28, 07:30 PM
I agree with pretty much everyone else here; this royalty notion won't fly with Apple being (for once) in the dominant market position. If Universal pulls their music/content, it'll all be downloaded illegally, since the Zune isn't about to replace the iPod as the must-have 'cool' item (even if Zune marketplace does offer the Universal catalog). Universal just wants Apple to throw them a bone.