rxse7en
Oct 10, 08:13 AM
Morning all,
Two things. Guesstimates on release of quad-core Mac Pros (time to upgrade here). And MultiMedia, how do you like the Dell 24" LCDs?
B
neilp4453
Feb 21, 03:16 PM
It's a bit rich calling people delusional and then coming out with wish-list statements as if they're bound in volumes of 'The Future History of Smartphones Vol. II'.
The Android market has potential, but only for as long as lazy phone manufacturers, who have never learned how to do operating systems and software, are happy to grab a freebie. This situation is the same as you or me going to a fair and picking up a free dev copy of some new software... and then running a business off its capabilities. No license fee! That's the attraction.
The saved costs derived from having much lower in-house dev costs and shorter route to market make Android a gift. But not without major issues. CylonGlitch [above] makes this very valid point:
"... as many as 40 models of Android devices will ship, . . . "
"How the heck is a developer supposed to support that many different devices? Even if there were 5 different screen resolutions, it would be hard to optimize your app for each. Now different RAM configurations, different CPU's, different everything, OUCH."
It's a ludicrous state of affairs. A wet dream for the armchair geek maybe, but for the non geek buyer, the proposition is entirely different. It already gives me a headache just thinking about it.
With the iPhone, Apple have demonstrated one of the oldest marketing principles still holds true in the 21st Century. If you give people three models to choose from with two colour options, you make the proposition simpler.
But all other manufacturers are still depending on the old marketing model of offering a bewildering array of models to try and catch the entire market. Now, that model has failed already - because it doesn't work. The market is automatically diluted. So why are they still using it?
speedriff [also above] has decided Steve Jobs is a "douche" because he's being "hardheaded" over Flash, while "Other manufacturers are giving AMOLED screens and are getting better and better."
Apple make more profit from all their products than anyone else. One way they do this is by waiting until they can demand a very high proportion of a large enough production of a component [NAND flash memory, screens etc] at the most competitive price, or can manufacture in-house [CPUs]. That's not just good business, it's vital for long term survival.
Wait until June this year and we'll see the new iPhone with a longer [HD aspect ratio] OLED screen. And HTML5 is the future. In reality, Adobe are better candidates for the 'douche' epithet here. If Flash had fewer issues, maybe Apple would add it.
What you need to understand is that Apple is better at seeing, predicting and exploiting the WHOLE picture, than any other company in this game. And anyone who seriously thinks a disparate group of not for profit developers and a market full of lazy manufacturers with a 19th Century sales mentality are going to win this one, is simply not even looking at it properly.
You really think so? I don't think Apple has done anything exceptional. They built off of their popular iPod brand. Any company could do the same... unfortunately not every company has something as popular as the iPod. Apple's entrée into the smartphone market was guaranteed from the start.
In your post, all I see is you ranting about the superiority of Apple while downplaying potential competition by overlooking what they have done thus far. In our case, competition is healthy, because if it were up to people like you, we would have to accept an iPhone 4G with the same specs as an iPhone 3GS. Yes, I am greatly exaggerating, but I hope you see my point.
Apple will do very little unless they are pressured to do a lot. I guess you missed my point where I said Apple does this on a regular basis with all of their items. Being last to implement anything new is not something they do because they are the epitome of marketing. They do it because they can.
JackAxe
Sep 26, 08:04 PM
Bernard was going to be my 2nd guess. :rolleyes:
I'm a Platinum Member; it seems with all the upgrades it's cheaper in the end. I'm going to have to slow down and take a look at 8.
*LOL* :D
If 3D were my primary income, I would invest in their maintenance plan, but as is, that's money I need for other upgrades. Maybe in the future.
Hope all goes well with 8. I won't be there for probably a year.
<]=)
Apple should put much needed development into the notebooks. The current crop of Mac Pros are perfect.
Let software catch up!
Speak for yourself. ;)
I certainly hope Apple is working on a pen book for this fall. A version with an upgraded Wacom digitizer that at least supports tilt. Preferably a version with full Intuos 3 specs. The PC versions are all stuck in the ArtZ II days.
<]=)
bigandy
Mar 20, 09:08 AM
anyone got a link to Mac PyMusique downloads or is it Windows only?
from what i see on its website 'tis a *nix programme... ie not windows.. ;)
emotion
Sep 20, 08:50 AM
I have one of these devices, it's excellent. Especially with the user community at http://toppy.org.uk/.
There's some good info on using one with a Mac here http://www.mtop.co.uk/intro.html
The stock EPG on the unit is a bit crufty but it's definitely improving. I'd recommend one to anyone looking for a decent PVR.
I'm glad I piped up about this now, thanks for that info tyr2.
sinsin07
Apr 9, 09:28 AM
If you don't believe me, there's plenty of history to read. Just go look at the following industries that were disrupted by technology...
jegbook
Apr 12, 04:06 PM
The delete thing bothers me a bit. What do you mean you can't move up? You mean with backspace? There is a preference in finder to show entire path so I never have trouble navigating up folder structure. If you are used to Vista and leaning toward 7, perhaps OS X isn't for you.
It's really not about how I delete things, nor is it about the pretty colors. It's about how much of my time I have to spend futzing with stuff like broken drivers, missing printers, yada yada yada.
I will admit I wasted a few hours this week chasing a Time Machine issue, but that's about all the futzing I've had to do since about November. I'm willing to deal with the limitations and quirks of OS X because OS X doesn't waste my time. And it wasn't something I had to do in order to send my taxes or print out show tickets. I did it when I felt like I had the time, unlike so many Windows problems that crop up on the way to an important meeting. I haven't seen an "are you sure" on my Mac since I got it. To me it sometimes seems like Windows was written to harvest clicks while OS X was written to avoid unnecessary user intervention.
Sure there are some quirks. Like the way copied folders are replaced, not merged with destination folders. Like the missing "cut" and "delete" features. But for me these quirks are no big deal and I look forward to sitting down in front of my Mac after suffering with 7 all day at work. But what we say in this thread isn't necessarily relevant to your situation. Based on what we have described, you can get a sense as to how "different" OS X is. To me, it's really not that much different. What is more important is how different it is to you and whether it bothers you.
Your comment about "suffering with 7 all day" is surprising to me. I don't know if I've seen Windows 7 experience a full OS crash. And I've been toying with Win 7 since it was in beta.
Sure, it ain't perfect, but I find Win 7 pretty darn efficient overall. I haven't encountered any OS related issues with 7 yet. Application quirks, sure, but not really any OS problems.
I'd say OS X and Win 7 are much more comparable than Vista or XP.
Again, it comes down mostly to what you need a computer to do.
Cheers, all.
iJohnHenry
Mar 26, 03:16 PM
Confucius say: Foolish is man who questions skunk in ancient tongues.
fifthworld
Mar 18, 08:40 AM
I believe nobody is abusing the system; instead, it's the system -unlimited, 2GB, 4GB, whatever- that is unable to cope with different needs. As AT&T can monitor bandwidth usage, just give us a plan where we pay based on usage, for example $5 for each block of 1GB, and be done with it!
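The metered plan proposed above is simple enough to sketch. This is a hypothetical illustration, not any carrier's actual billing logic; the $5-per-1GB-block figure comes from the post, and the function name is made up:

```python
import math

def monthly_charge(gb_used, block_gb=1, price_per_block=5):
    """Bill usage in whole blocks; any partial block rounds up."""
    blocks = math.ceil(gb_used / block_gb)
    return blocks * price_per_block

print(monthly_charge(0.4))   # light user: 1 block  -> $5
print(monthly_charge(3.2))   # heavier user: 4 blocks -> $20
```

Rounding partial blocks up is the usual convention for block pricing; a light user pays one block, a heavy user pays proportionally more.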
lifeinhd
Apr 12, 10:21 PM
This is what iMovie after iMovie '06 should have been, if only because it has a PROPER FRICKIN' TIMELINE!
Was really hoping for $199, but $299 isn't bad. I might just upgrade from iMovie '06 (I'm not really a 'pro' editor, but I love my timelines!).
Peace
Sep 12, 05:54 PM
To help quell confusion this device WILL be 802.11n
There will be no problem streaming DVD quality or even 720P
Liquorpuki
Mar 13, 09:56 PM
They were talking about a 100 square mile solar plant. Take this PopSci link (http://www.popsci.com/environment/article/2009-06/solar-power) for example. A 20-acre site produces 5 megawatts. One square mile (640 acres) would provide 160 megawatts, so one hundred square miles would provide 16,000 megawatts (16 gigawatts). The link says the country will need 20 gigawatts by 2050. The worst possible accident in this case does not result in thousands of square miles being permanently (as far as this generation is concerned) contaminated.
In contrast: Japan Disaster May Set Back Nuclear Power Industry (http://www.usatoday.com/news/world/2011-03-14-quakenuclear14_ST_N.htm). As far as I know, solar farms don't "melt down", at least not in a way that might affect the entire population of a U.S. state. I understand nuclear reactors are built to hold in the radiation when things go wrong, but what if they don't, and what a mess afterwards.
You need to separate capacity from demand. Capacity is just the maximum power a station can theoretically produce. In practice, most of these renewable stations never reach that max. I've checked the stats at my utility's wind farm and that thing is usually around 9% of capacity. Considering a wind farm costs 4 times as much money as a natural gas generator to build for the same capacity, efficiency-wise, the station is a joke.
What's more important is demand - being able to produce enough energy when we need it. This is where solar and wind fall short. They don't generate when we want them to, they only generate when mother nature wants them to. It would be fine if grid energy storage (IE batteries) technology was developed enough to be able to store enough energy to power a service area through an entire winter (in the case of solar). But last I checked, current grid energy storage batteries can only store a charge for 8-12 hours before they start losing charge on their own. They're also the size of buildings, fail after 10 years, and cost a ton of money.
This is why a lot of utilities have gone to nuclear to replace coal and why here in the US, we still rely on coal to provide roughly 50% of our electricity and most of our base load. There are few options.
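The nameplate-vs-delivered distinction in this exchange is just arithmetic, and can be sketched quickly. This assumes the figures quoted in the thread (5 MW per 20-acre site, a 9% capacity factor observed at one wind farm); real capacity factors vary widely by technology and site:

```python
# Figures quoted in the thread: a 20-acre site produces 5 MW nameplate.
MW_PER_ACRE = 5 / 20          # 0.25 MW peak per acre
ACRES_PER_SQ_MILE = 640

def nameplate_mw(square_miles):
    """Peak (nameplate) capacity for a plant of the given area."""
    return square_miles * ACRES_PER_SQ_MILE * MW_PER_ACRE

def average_mw(square_miles, capacity_factor):
    """Average delivered power after derating by a capacity factor."""
    return nameplate_mw(square_miles) * capacity_factor

print(nameplate_mw(1))           # 160 MW per square mile
print(nameplate_mw(100))         # 16,000 MW (16 GW) for the 100 sq mi plant
# At the 9% capacity factor cited for the wind farm, delivered power
# falls far below nameplate:
print(average_mw(100, 0.09))     # 1,440 MW average
```

The gap between the last two numbers is exactly the capacity-vs-demand point: the headline gigawatt figure says little about what the plant actually delivers over a year.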
Icaras
Apr 9, 12:43 AM
That's a complete joke, surely? There's no way you can compare console gaming, in basically a home arcade, to swiping your fingers around on a 3.5" screen. No way. I am a gamer, and always will be.
Gaming on the iPhone is good for 2-minute bursts, such as when sitting on the toilet. It's not a great games device. Most of the games are cheap with no replay value.
Say that about games like Final Fantasy III, Aralon, or even NOVA 2. Try finishing any of these games in one sitting on the toilet. :eek:
You're right about prematurely comparing iOS to console gaming though. However, I feel iOS absolutely competes with handheld devices by Sony and Nintendo.
I feel the quality is there for many games and growing. I think it would be foolish to dismiss gaming on iOS when there is obvious growth and a healthy consumer market happening at the App Store.
Rodimus Prime
Oct 7, 02:18 PM
Valid points, except you're looking at a micro-niche of power-users, while the iPhone's massive growth comes from a much broader market than that. Android will (and does) take some power-user market share, and I look forward to seeing where it goes.
The big thing though is DEVELOPER share. Apps. Android will run--in different flavors--on a number of different phones, offering choice in screen size, features, hard vs. virtual keys, etc. That sounds great--but will the same APP run on all those flavors? No. The app market will be fragmented among incompatible models. There's no good way out of that--it's one advantage Apple's model will hang on to.
Yet the one advantage the Apple model has is killed by how difficult it is to get an app approved, and by there being no way to sell directly to the consumer.
That is what is going to hurt Apple: the good devs leaving. The best devs are starting to get fed up with Apple's system and looking elsewhere.
skunk
Mar 11, 03:55 PM
2149: The Kyodo news agency is now citing a safety panel as saying that the radiation level inside one of the reactors at the Fukushima-Daiichi nuclear plant is 1,000 times higher than normal.
http://www.bbc.co.uk/news/world-middle-east-12307698
Looking hairier by the minute. :eek:
aegisdesign
Oct 26, 05:11 AM
JUST IMAGINE A COMPUTER IN WHICH EACH PIXEL IS CONTROLLED BY A SINGLE PROCESSOR.
I've used one. Back in the 1980s, beginning of the 90s. The low end model had 1024 processors and the high end model 4096 processors. It was a pig to program. When drawing on the screen you split the task at hand up into many parallel threads each drawing a part of the screen. Not quite 1 CPU per pixel but you get the idea.
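The split-the-screen-into-bands approach described above can be sketched in modern terms. This is a toy illustration (the `shade` function stands in for real per-pixel drawing work), not the actual programming model of that 1980s machine:

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT, WORKERS = 64, 64, 4

def shade(x, y):
    # Toy per-pixel computation standing in for real drawing work
    return (x * y) % 256

def draw_band(rows):
    # Each worker fills one contiguous horizontal band of the framebuffer
    return [[shade(x, y) for x in range(WIDTH)] for y in rows]

# Split the screen into WORKERS equal horizontal bands
bands = [range(i * HEIGHT // WORKERS, (i + 1) * HEIGHT // WORKERS)
         for i in range(WORKERS)]

with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    # map() preserves band order, so concatenating rebuilds the full image
    framebuffer = [row for band in pool.map(draw_band, bands) for row in band]
```

Four workers instead of thousands of processors, but the decomposition is the same idea: each unit of hardware owns a region of the screen.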
richard.mac
Mar 11, 01:54 AM
Crap! :( Thoughts to the Japanese living there. Earth is fierce atm! Disastrous earthquakes in cities there and in New Zealand, and that flooding in Australia.
God of Biscuits
Mar 23, 05:21 PM
Probably, unless Apple recognizes the competition and responds by:
- SDK that can execute on other platforms like Windows or Linux and that uses a more user-friendly and intuitive language than Objective-C
You clearly have no idea what you're talking about.
What you really mean is something more popular. And that's certainly NOT the same as "more user friendly" or "more intuitive".
Are you even an Objective C programmer?
At any rate, what you *are* is the bazillionth person who's said that the key to Apple's success in the future is to do what everyone else is doing.
Riiiiiiight.
CalBoy
Apr 22, 11:17 PM
Listen, Bill Nye, I wasn't making a conclusive observation on the history of the earth, universe, or life forms. I was posing a question that most people (for the sake of simplicity, not illiteracy) relate to with a single word, "bang." If I need an expert opinion for my next astronomy class, I'll give you a ring.
The whole point is that it is not a single "bang." You're trying to conflate how most people view their god with how people conceptualize science. They simply aren't the same.
It's easy to relate to a single term for everything when that one thing, according to your beliefs, is the answer to everything. It's nearly impossible to do that when the answers to your questions are varied and specific.
Only the scientifically illiterate relate "bang" to "origin of life."
maclaptop
Apr 26, 07:47 AM
It's about power and control- nothing more.
Think Obama & Jobs the supreme power couple :)
MacCoaster
Oct 12, 04:19 PM
javajedi: Well, well... I finally figured out GNUstep and ported your Cocoa program to it; it works 100%. Funny thing is, it's slower than the Java one, but it might be the extra crap I put in there (menus, etc.): 10 seconds compared to 7 seconds with Java. But that's still faster than 70 seconds on a G4. I'll be making a pure C port if no one else has.
Cabbit
Apr 15, 12:43 PM
Just to note, there is gay behaviour in the animal kingdom; my two male cats went at it in their puberty, and it is well documented in other animals. It is perfectly natural, and long before the time of the Christian god's creation, gay behaviour was tolerated in Rome, though lesbian behaviour was not.
And marriage between same-sex couples is legal in many parts of Europe; it is mainly the third world and the developing world that have the biggest issue with same-sex marriage. But as these countries traditionally follow Europe, expect religion to decline as more and more people become educated, and with the decline of religion, expect such nonsense as hating each other over whom we love to fade away too.
AppliedVisual
Oct 26, 10:34 AM
Considering that Windows supports up to 64 CPU cores, and that 64 core Windows machines are available - it would be nice if you could show some proof that OSX on a 64 CPU machine scales better than Windows or Linux....
Are you being overly pedantic or do you just want to argue? I said WinXP -- "probably as good or better than WinXP". WinXP only supports two CPUs with a max of 4 cores each right now as per the EULA. The Windows kernel itself actually handles CPU division and scales dynamically based on addressable CPUs within a system, all the way up to 256 CPUs or cores, with support for up to 4 logical or virtual CPUs each. And just think where those 64-CPU Windows systems are going to be in the near future as they're upgraded with quad-core CPUs from AMD/Intel...
BTW: You have to buy Windows Server Datacenter Edition to get to all those CPUs.
jettredmont
May 3, 03:44 PM
Of course, I don't know of any Linux distribution that doesn't require root to install system wide software either. Kind of negates your point there...
I wasn't specific enough there. I was talking about how "Unix security" has been applied to the overall OS X permissions system, not just "Unix security" in the abstract. I'll cede the point that this does mean that "Unix security" in the abstract is no better than NT security, as I can not refute the claim that Linux distributions share the same problem (the need to run as "root" to do day-to-day computer administration). I would point out, though, that unless things have changed significantly, most window managers for Linux et al refuse to run as root, so you can't end up with a full-fledged graphical environment running as root.
You could do the same as far back as Windows NT 3.1 in 1993. The fact that most software vendors wrote their applications for the non-secure DOS based versions of Windows is moot, that is not a problem of the OS's security model, it is a problem of the Application. This is not "Unix security" being better, it's "Software vendors for Windows" being dumber.
Yes and no. You are looking at "Unix security" as a set of controls. I'm looking at it as a pragmatic system. As a system, Apple's OS X model allowed users to run as standard users and non-root Administrators while XP's model made non-Administrator access incredibly cumbersome.
You can blame that on Windows developers just being dumber, or you can blame it on Microsoft not sufficiently cracking the whip, or you can blame it on Microsoft not making the "right way" easy enough. Wherever the blame goes, the practical effect is that Windows users tended to run as Administrator and locking them down to Standard user accounts was a slap in the face and serious drain on productivity.
Actually, the Administrator account (much less a standard user in the Administrators group) is not a root level account at all.
Notice how a root account on Unix can do everything, just by virtue of its 0 uid. It can write/delete/read files from filesystems it does not even have permissions on. It can kill any system process, no matter the owner.
Administrator on Windows NT is far more limited: it can't casually bypass broken ACLs, and it can't kill processes owned by "System". Sysinternals provided tools that let you do it, but Microsoft did not.
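The uid-0 point above is easy to demonstrate. A minimal Python sketch (the temp file and its "secret" content are just placeholders for illustration) shows that the kernel skips the permission-bit check entirely for effective uid 0, and that the same goes for signalling system processes -- signal 0 delivers nothing but still runs the kernel's permission test:

```python
import os
import tempfile

# Create a throwaway file and strip every permission bit from it.
fd, path = tempfile.mkstemp()
os.write(fd, b"secret")
os.close(fd)
os.chmod(path, 0o000)

try:
    with open(path) as f:
        f.read()
    readable = True   # euid 0: the kernel skips the mode-bit check
except PermissionError:
    readable = False  # any other euid is refused, even the file's owner

# Signal 0 delivers no signal, but the kernel still performs the
# permission check against the target process (pid 1 here).
try:
    os.kill(1, 0)
    can_signal_init = True
except PermissionError:
    can_signal_init = False

print(f"euid={os.geteuid()} readable={readable} can_signal_init={can_signal_init}")
os.remove(path)
```

Run it as an ordinary user and the mode-000 file is unreadable even though you own it; run it under sudo and both checks pass purely because euid is 0.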
Interesting. I do remember being able to do some pretty damaging things with Administrator access in Windows XP such as replacing shared DLLs, formatting the hard drive, replacing any executable in c:\windows, etc, which OS X would not let me do without typing in a password (GUI) or sudo'ing to root (command line).
But, I stand corrected. NT "Administrator" is not equivalent to "root" on Unix. But it's a whole lot more "trusted" (and hence all apps it runs are a lot more trusted) than the equivalent OS X "Administrator" account.
UAC is simply a GUI front-end to the runas command. Heck, shift-right-click already had the "Run As" option. It's a glorified sudo. It uses RDP (since Vista, user sessions are really local RDP sessions) to prevent spoofing: the elevation prompt shows up on the "console" session while the user's display resides on an RDP session.
Again, the components are all there, but while the pragmatic effect was that a user needed to right-click, select "Run as Administrator", then type in their password to run something ... well, that wasn't going to happen. Hence, users tended to have Administrator access accounts.
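The elevation decision both camps are arguing about really does boil down to a few lines. As a hedged sketch -- the `elevated` helper below is hypothetical, not a real API, and real UAC additionally isolates its prompt on a separate session, which plain sudo does not -- the common core is: if you already hold the privilege, run as-is; otherwise, route the command through the elevation broker:

```python
import os

def elevated(argv):
    """Return argv prefixed so it would run with admin rights.

    Hypothetical helper for illustration only: this is roughly the
    decision both 'sudo' and UAC's consent flow make. If we already
    hold the privilege, run the command unchanged; otherwise hand it
    to the elevation broker (sudo here, consent.exe on Windows).
    """
    if os.geteuid() == 0:               # already root: nothing to do
        return list(argv)
    return ["sudo", "--"] + list(argv)  # ask the broker to elevate

print(elevated(["rm", "-f", "/var/log/old.log"]))
```

The friction difference is entirely in how the broker asks: a one-line password prompt in a terminal versus a right-click, a menu pick, and a dialog.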
There, you did it, you made me go on a defensive rant for Microsoft. I hate you now.
Sorry! I know; it burns!
...
Why bother, you're not "getting it". The only reason the user is aware of MACDefender is because it runs a GUI-based installer. If the executable had had zero GUI code and just ran stuff in the background, you would have never known until you couldn't find your files or some Chinese guy was buying goods with your CC info, fished right out of your "Bank stuff.xls" file.
Well, unless you have more information on this than I do, I'm assuming that the .zip file was unarchived (into a sub-folder of ~/Downloads), a .dmg file with an "Internet Enabled" flag was found inside, then the user was prompted by the OS if they wanted to run this installer they downloaded, then the installer came up (keeping in mind that an "installer" is a package structure potentially with some scripts, not a free-form executable, and that the only reason it came up was that the OS's 'Installer' app opened it and recognized it). I believe the Installer also asks the user's permission before running any of the preflight scripts.
Unless there is a bug here exposing a security hole, this could not be done without multiple user interactions. The "installer" only ran because it was a set of instructions for the built-in installer. The disk image was only opened because it was in the form Safari recognizes as an auto-open disk image. The first time "arbitrary code" could be run would be in the preflight script of the installer.