Roessnakhan
Mar 22, 12:57 PM
I agree.
But who in their right minds would want to own something called a Playbook? :o
The iPad isn't exactly a name to write home about either. Then, neither is TouchPad, Xoom, or Galaxy Tab.
Sydde
Mar 21, 01:28 PM
Not that they're absolutely mutually exclusive, but I'm curious--how do you reconcile the first statement with the latter two?
It is in fact somewhat problematic. At present, the government is basically run by moneyed interests that supply the funding needed for the candidates to get into office (so that they can shower favours upon their benefactors and sponsors). This is the sixth check/balance, which was not literally codified but has become standard via legal precedent. Those of us who feel that real change is called for still support the (p)resident because he is the one least likely to enact tragic "progress". The system is, nonetheless, dreadfully broken. The idiots that I hold in serious contention are either marginalized into submission or holding seats of power, a situation that serves only to amplify our division, to our detriment.
Alexsaru
Sep 13, 06:54 AM
I was interested to see that they were unable to max out CPU utilization on all 8 cores in the system. I hope that's because today's software isn't ready to fully use more than one or two cores, and not a limitation of OS X's ability to scale to larger core counts, since that's obviously where we're heading. Does anyone know about the potential scalability of OS X to large numbers of CPUs/cores? I know some *nix and BSD varieties do this really well, but one wonders whether they were thinking this far into the future when they developed OS X. It'll be interesting to see...
TheQuestion
Mar 26, 12:22 AM
Can't believe it's anywhere near GM time. Way too many bugs and inconsistencies in behavior. New networking tools in Server have to be implemented now that SMB is being canned - that's not a minor addition. Calling it a release candidate is a stretch, but calling it GM is just plain crazy.
Multimedia
Jul 21, 12:20 PM
It really depends on your application.
On the desktop, if you're a typical user that's just interested in web surfing, playing music files, organizing your photo collection, etc., more than two cores will probably not be too useful. For these kinds of users, even two cores may be overkill, but two are useful for keeping a responsive UI when an application starts hogging all the CPU time.
If you start using higher-power applications (like video work - iMovie/iDVD, for instance) then more cores will speed up that kind of work (assuming the app is properly multithreaded, of course.) 4-core systems will definitely benefit this kind of user.
With current applications, however, I don't think more than 4 cores will be useful. The kind of work that will make 8 cores useful is the kinds that requires expensive professional software - which most people don't use...
Cluster computing has similar benefits. With 8 cores in each processor, it is almost as good as having 8 times as many computers in the cluster, and a lot less expensive. This concept will scale up as the number of cores increases, assuming motherboards can be designed with enough memory and FSB bandwidth to keep them all busy.
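The scaling argument above can be sketched in a few lines. This is a minimal illustration, not any shipping app's code: a CPU-bound job (a stand-in for a frame-encode or filter pass) divided across worker processes, assuming the work items are independent:

```python
import os
import time
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    """CPU-bound stand-in for one encode or filter job."""
    return sum(i * i for i in range(n))

def run_jobs(n_jobs, workers):
    """Run n_jobs identical CPU-bound tasks across `workers` processes."""
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(busy_work, [200_000] * n_jobs))
    return time.perf_counter() - start, results

if __name__ == "__main__":
    cores = os.cpu_count()
    serial, _ = run_jobs(8, workers=1)
    parallel, _ = run_jobs(8, workers=cores)
    print(f"{cores} cores: 1 worker took {serial:.2f}s, "
          f"{cores} workers took {parallel:.2f}s")
```

The speedup only appears if the app is written this way in the first place, which is exactly the "properly multithreaded" caveat above, and it flattens out once memory or bus bandwidth can't keep the cores fed.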
I think we might see a single quad-core chip in consumer systems, like the iMac. I think it is likely that we'll see them in Pro systems, like the Mac Pro (including a high-end model with two quad-core chips.)
I think processors with more than 4 cores will never be seen outside of servers - Xserves and maybe some configurations of Mac Pro. Mostly because that's where there is a need for this kind of power.
I strongly disagree. I could use 16 cores right now for nothing more than simple consumer electronics video compression routines. There will be a Mac Pro with 8 cores this Winter 2007.
You are completely blind to the need for many cores right now for very simple stupid work. All I want to do is run 4 copies of Toast while running 4 copies of Handbrake simultaneously. Each wants 2 cores or more. So you are not thinking of the current need for 16 cores already.
This is not even beginning to discuss how many Final Cut Studio editors need 16 cores. Man, I can't believe you wrote that. I think you are overlooking the obvious - the need to run multiple copies of today's applications simultaneously.
So as long as the heat issue can be overcome, I don't see why 8 Cores can't belong inside an iMac by the end of 2008.
I apologize if I read a little hot. But I find the line of thought that 4 or 8 cores are enough, or more than enough, really annoying. They are not nearly enough for those of us who see the problem of not enough cores EVERY DAY. The rest of you either have no imagination or are only using your Macs for word processing, browsing and email.
I am sincerely frustrated by not having enough cores to do simple stupid work efficiently. Just look at how crippled this G5 Quad is already only running three things. They can't even run full speed due to lack of cores.
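The "several encodes at once" workflow described above doesn't even need the apps themselves to be multithreaded; a small launcher is enough, and the OS scheduler spreads the processes over the cores. A sketch, with placeholder file names (HandBrakeCLI must actually be installed for the commented-out line to run):

```python
import shlex
import subprocess

def run_batch(commands):
    """Start every command at once, then wait for all of them.
    The OS scheduler spreads the concurrent processes over the cores."""
    procs = [subprocess.Popen(shlex.split(cmd)) for cmd in commands]
    return [p.wait() for p in procs]

# Hypothetical batch: four HandBrake encodes side by side.
# disc0.iso ... out3.mp4 are placeholders, not real files.
jobs = [f"HandBrakeCLI -i disc{i}.iso -o out{i}.mp4" for i in range(4)]
# exit_codes = run_batch(jobs)
```

This is the batch-of-independent-jobs pattern: each process happily takes a core or two, so the total throughput tracks the core count even when no single app scales past two cores.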
DeathChill
Mar 31, 09:52 PM
No, it's "make up a fake day" day.
Good. I declare it dog moustache day.
citizenzen
Mar 18, 09:06 PM
I am very unhappy that Obama did not get us out of a state of War. Which pacifist do you plan on voting for this next time around?
What pacifist ever has a realistic chance of becoming the next "commander-in-chief"?
That's why 5P's contention is so ridiculous.
Candidates must paint themselves as "strong" and capable of leading our military, otherwise there'd be little chance they'd be elected as president.
notabadname
Mar 22, 03:45 PM
iPad: 1024x768
7.76″ x 5.82″
45.2 square inches
PlayBook: 1024x600
3.54″ x 6.04″
21.4 square inches
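Those figures can be checked from each screen's resolution and diagonal alone. A quick sketch, assuming the advertised diagonals (9.7″ for the iPad, 7″ for the PlayBook):

```python
import math

def panel_stats(px_w, px_h, diagonal_in):
    """Return (width_in, height_in, area_sq_in, ppi) for a display,
    derived from pixel resolution and the diagonal size in inches."""
    diag_px = math.hypot(px_w, px_h)          # diagonal in pixels
    width = diagonal_in * px_w / diag_px      # scale to physical inches
    height = diagonal_in * px_h / diag_px
    return width, height, width * height, diag_px / diagonal_in

w, h, area, ppi = panel_stats(1024, 768, 9.7)
print(f"iPad: {w:.2f} x {h:.2f} in, {area:.1f} sq in, {ppi:.0f} PPI")

w, h, area, ppi = panel_stats(1024, 600, 7.0)
print(f"PlayBook: {w:.2f} x {h:.2f} in, {area:.1f} sq in, {ppi:.0f} PPI")
```

The math reproduces the areas quoted above (45.2 and 21.4 square inches), so the iPad's panel has a bit more than twice the screen area of the PlayBook's.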
ten-oak-druid
Mar 22, 04:19 PM
Competition is good.
Make a case for your argument.
jonnysods
Apr 8, 06:04 AM
Seems like a pretty big slap on the wrist. Wonder if this is true....
osx11
Mar 22, 12:58 PM
.2 mm thinner?
let the war begin.
dba7dba
Apr 20, 11:47 AM
After reading some of the lawsuit, I had to post this..
http://pk.funnyseoul.com/wp-content/uploads/2010/11/2010-11-04_174623.jpg
http://pk.funnyseoul.com/wp-content/uploads/2010/11/pn_20101104170853.jpg
http://pk.funnyseoul.com/2010/11/galaxy-tab-released/
Are you aware that Apple copied the iBooks GUI from another software vendor? I remember seeing it years before the iPad was out (back in the G4 era), before iBooks. It was for keeping an inventory of books on a Mac.
I'm not gonna bother going looking for the link/screenshot, but trust me, that look was used by another software vendor BEFORE Apple used it. And I'm assuming that's one reason this wasn't mentioned in the suit.
Edit:
Actually here it is.
http://www.delicious-monster.com/
http://www.delicious-monster.com/images/librarypage/screenshots/inspector_0_topmatter.png
Won an Apple Design Award in 2005. And when was iBooks introduced?
LC475
Apr 11, 06:49 PM
There's nothing to fear about Apple making FCP less than professional.
The thing to understand is that NLEs never change their basic structure of how editing works, i.e. moving clips in the timeline, trimming, etc. Look at Avid - it hasn't changed much at all since the 90s, because they know if they did, they would lose their base of users. Avid came in the early 90s, and FCP came in the late 90s. FCP is an improvement on the Avid idea of NLE editing, and it's a good improvement. That's one reason why it became popular. Sure, the GUI might change, but the basic way of working will not. After Effects is a good example. The GUI looks totally different than it did in version 5, but you can still work in basically the same way.
I don't understand what people mean by FCP lagging behind Avid and Adobe. In the last couple years, FCP has been making strong gains in Hollywood. WB, 20th Fox, Paramount have all used FCP on major movies. I worked as an AE on one of them. Professionals like FCP, many movie editors I know like FCP, major post houses use it, and I'm sure after tomorrow we will like it even more.
If anything, FCP has become less of a consumer app and more of a professional one. Hollywood wouldn't have thought of using FCP in 1999 on version 1, but they're using it now. It's become more professional over the last ten years.
With the new technology of Thunderbolt, 64-bit support, and multithreading support, in addition to iPad support, we should see an awesome upgrade tomorrow.
MacRumors
Mar 25, 10:25 PM
http://www.macrumors.com/images/macrumorsthreadlogo.gif (http://www.macrumors.com/2011/03/25/apple-already-nearing-golden-master-candidate-versions-of-mac-os-x-lion/)
http://images.macrumors.com/article/2011/03/25/232441-lion_mission_control.jpg
Machead III
Sep 19, 09:27 AM
I hope that the MacBook with Core 2 Duo is better than the Core Duo version :)
I hope it's worse?
Flowbee
Aug 11, 10:28 AM
Arrrggh... too many conflicting rumors make my head a splode. :eek:
4God
Jul 14, 03:56 PM
This means that the 2.7 GHz G5 of a year or more ago would still be the high mark for CPU speeds in the PowerMac/Mac Pro line. We already had the dual dual-core 2.5 GHz G5 a year ago. An increase to 2.66 GHz means that in either 2008 or 2009 we will see the promised 3 GHz PowerMac/Mac Pro.
Any bets on which year it will be?
Bill the TaxMan
I think we'll see more cores per CPU before we see 3 GHz. IMHO, 4, 8, or more cores at 2.66 GHz is far better than 1 or 2 cores at 3 GHz.
Erasmus
Aug 26, 06:45 PM
I vote Apple release a modified version of the Core 2 Duo Macbook Pro.
The only difference would be the words "PowerBook G5" under its screen, a change of the label on the box to "Dual 2.33 G5", and software that changes the processor name reported in System Profiler to "IBM PowerPC G5 Dual 2.33".
This would make the IBM fanboys very happy, as they would think they had a G5 PowerBook, and therefore the wishes for "G5 PowerBooks next Tuesday" would hopefully stop.
Apple could sell them for five times the cost of a regular Macbook Pro, and get a healthy 20 grand profit off each sale for almost no effort on their part.
PBF
Apr 11, 11:05 PM
If they delay iPhone 5 until Fall/Winter, then they'd better release the white iPhone 4 some time in Spring as promised by Phil Schiller. :mad:
marksman
Mar 31, 04:57 PM
Only if you do not add products like the iPad and the iPod Touch. In other words, if you throw out 50% of the iOS products.
I would add that I never understand the comparison of smartphones running Android to smartphones running iOS.
Neither Google nor Apple sells its phone operating system, and the Android spectrum is made up of 50 handsets from 10 different manufacturers who are in direct competition with each other. They are not one big group working together to take on Apple. It makes absolutely zero sense to make that kind of comparison.
It is just as weird as lopping off iPod and iPad iOS users...
If people want to compare smartphones, then compare actual sales of individual smartphones, each of which only uses one OS. People should not draw meaningless lines in the sand lumping all Android-based handsets together, because they are not together other than that they run Android. They might as well compare black phones to white phones.
I imagine if you made a chart of the top-selling smartphones of the last 5 years, it would consist of the iPhone 4, the iPhone 3GS, the iPhone 3G and the iPhone.
Why not group smartphones by what kind of graphics chip they have or what type of memory chip they use? The OS is irrelevant. Nobody in the smartphone business is directly making money off any of these OSes; it is a stupid way to categorize smartphones.
Of course it happens because if they didn't lump them together it would look absurd, with Apple totally dominating the smartphone market with its latest phone every year while 100 Android commodity phones all have tiny market shares, just to be replaced by the next one.
How does HTC running Android benefit or relate to a Motorola phone running Android? It does not, at all.
brianus
Sep 15, 12:26 PM
No, that is not true, in fact it couldn't be more untrue. Now, the 95 family (95/98/ME) was a totally different codebase. But with the NT family (NT/2000/XP) the client and the server were identical, even identical in distributed code. In fact there was a big scandal years ago where someone discovered the registry setting where you could turn NT Workstation into NT Server. Back then all that was different was the number of outbound IP connections and possibly the number of CPUs supported. All they were trying to do with Workstation was prevent you from using it as a server (thus the outbound IP limit) and at some point they didn't give you full-blown IIS on Workstation. That's it.
Dude, how many times do I have to repeat myself before you myopic '90s-era IT geeks understand me? I was referring to the difference between Windows 9x and Windows NT. I neither knew, nor care, that there were different versions of NT itself. For. Christ's. Sake. I have said this three times now. Don't make me come over there.
On an unrelated note, wouldn't it be cool to effectively install a whole OS in RAM? That would be noticeably quicker....
I keep hearing speculation that they'll start using NAND flash to help with startup times in laptops, things like that -- now, how would that work? Doesn't everything have to be on the boot volume? OSes seem to assume these days that the OS, programs and user directories are all going to be on one volume, and you have to be kind of technically literate to do it differently..
CaoCao
Feb 28, 08:40 PM
Huh?
Wouldn't it also, then, be like the same way that heterosexuality causes attraction to the opposite sex?
No because heterosexuality is the default way the brain works
Eidorian
Jul 20, 09:43 AM
There are serious electrical and physical problems with jacking up clock speeds much further than they are now. Intel managed to push their chips to 3.8 GHz, but the power consumed was tremendous.
Fixed
generik
Sep 19, 01:15 AM
Haha, sounds like other people's disappointment amuses you. Feeding the fires of anticipation there... I can play along.
Any likelihood that we will see a laptop (NOT notebook) that can actually be used in one's lap without suffering from burns?!
Well it is not "other people's" disappointment, I know for a fact that if the nice HDD bay didn't make it into the next speed bump I'd be royally pissed. But yeah, somehow I have a really bad feeling that it is just going to be a chip swap.