Benjamins
Apr 8, 11:17 PM
Velly interesting. Did they start out making games from rocks?
they started off making card games.
peharri
Sep 24, 05:08 PM
The iTV most definitely requires a computer.
There's no evidence of this. Nothing has been said suggesting anything of the sort.
The iTV is like a souped-up Airport Extreme for video.
No, it isn't. It's not remotely like an Airport Extreme.
It has already been demoed and it requires a computer. The computer streams the iTunes content to the iTV and the iTV receives the stream and translates it into video and audio out via an HDMI or SVGA connection to your TV.
This is not the case. There's only been one demonstration so far, and the controlling part was the iTV, not the server.
The iTV also supports Front Row and allows remote control of the iTunes source machine.
What was demonstrated was a box that can view iTunes libraries on the local network. There's no evidence it "controls" the source machine beyond telling it to send a stream (like any iTunes client.)
There may be more features in the future, but those are the reported and demoed features.
The reported and demo'd features are of a standalone box that can access iTunes libraries. The box is reported to have storage (which is what this entire thread is about!)
It most certainly is not of some souped up Airport Extreme. That was what was widely rumoured before the Showtime presentation, and it turned out to be completely false. Whatever the debate of the precise capabilities of the iTV may be, the device demo'd couldn't be further from being an Airport Extreme if it tried.
ATD
Sep 26, 12:51 PM
I bet I could peg all 8 cores doing a 3D render...easily.
Bring them I say. This may make me hold off on my render farm idea.
-mark
I'm already doing 6 cpu renders. Why stop at 8, I'll take 16 :D
MorphingDragon
May 2, 10:02 AM
This is exactly the kind of ignorance I'm referring to. The vast majority of users don't differentiate between "virus", "trojan", "phishing e-mail", or any other terminology when they are actually referring to malware as "anything I don't want on my machine." By continuously bringing up inane points like the above, not only are you not helping the situation, you're perpetuating a useless mentality in order to prove your mastery of vocabulary.
Congratulations.
Stupid people will be vulnerable to malicious intent no matter the form or operating system. I find the "*nix has no viruses" tune wholly justified until reality differs.
bedifferent
May 2, 12:22 PM
Except antivirus doesn't usually catch things like this, and neither does anti-spyware, since it acts like a legit program.
I fix Windows machines and servers for a living, and unfortunately the majority of my week is spent removing said malware from Windows machines.
Agreed. I charge about $125-150/hour working on Windows systems. Initially issues weren't virus/malware related, but I always do a full system scan and find at least a dozen or so on the majority of them. Whether it's PEBKAC (Problem Exists Between Keyboard And Chair) errors, or viruses and malware (most do not update their anti-virus data and it's increasingly difficult to catch new viruses as so many new ones appear), I make most of my money working part-time in Communications and IT on Windows systems.
People complain that for the price of the bill they could have purchased a new machine, to which I reply that if it's a Windows-based system they will still have these issues.
However, I do not like this news one bit. It's not serious to us, as we're not the average Joe Mac user, but it demonstrates that OS X isn't 100% secure (just much more difficult to crack).
No computer for which the user can write or install programs will ever be free of Malware (nor, to my knowledge, has the "malware free" term ever been applied to the Mac OS by anyone actually familiar with computer security). All I have to do is write a script that formats your hard drive, call it ReallyFunGame, thereby deceiving you into downloading it and running it, and poof.
Unlike Windows-based .exe's, the user either has to open the .dmg, drop the malware app in their Applications folder and run it, or run the package installer. Unlike Windows, the user needs to run it, and it is difficult to fully remove Windows malware/viruses since they propagate through the OS much more than in OS X (system registry, etc.). So in OS X the user has to engage the malware; in Windows much of it can happen without the user's knowledge.
As OS X is predominantly a consumer product, most hackers focus on Windows-based OS's, which are traditionally business-oriented. This is not to say that OS X is 100% secure, far from it, but currently it's the more secure consumer/business OS on the market.
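To make the quoted "ReallyFunGame" scenario concrete, here is a deliberately harmless Python sketch of what such a trojan's skeleton looks like. The name is taken from the quote above; the destructive payload is omitted and replaced with a comment, since the point is only that a program a user is tricked into running inherits all of that user's privileges.

#!/usr/bin/env python3
# "ReallyFunGame": a deliberately harmless stand-in for the trojan
# described in the quote above. No destructive payload is included.

import os

def main():
    print("Loading ReallyFunGame...")
    home = os.path.expanduser("~")
    # A real trojan's payload would go here. Anything this process does
    # runs with the invoking user's privileges, so it could read, modify,
    # or delete any file that user can reach.
    print("This process can touch everything under " + home)

if __name__ == "__main__":
    main()

No OS can fully defend against this class of attack; whether the vector is an .exe or a .dmg, it is the user who ultimately grants the privileges.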
babyj
Sep 20, 12:11 PM
What do you do with your Xbox that would be relevant to watching videos on your TV?
Can you load Vids onto the Xbox HD and play them??
If you fit a mod chip to an Xbox it allows you to do lots of cool things, including upgrading the hard drive and running non-certified software (e.g. homebrew, open source).
Probably the most popular is Xbox Media Centre, which does what the name says and does it pretty well. So for about £150 you end up with a decent media centre, which isn't bad, plus it's a games console as well.
IntelliUser
Apr 15, 10:04 AM
The transsexual kinda kills the whole message though. "Learn to accept yourself for who you are, except if you can't, then deform your body to look like someone else."
Homosexuality may not be a disease, but Gender Identity Disorder certainly is.
Pilgrim1099
Apr 10, 10:28 AM
You mean Microsoft, right? And the interesting part is, Gates is still alive.
Two problems with your pseudo-intellectual response.
1. Gates has retired from Microsoft. Who's running the show now?
2. Who is the sicker of the two? Jobs or Gates?
sth
Apr 13, 04:20 AM
Some pro-style questions that have been left unanswered
Some of those questions actually were answered (for example, that full keyboard control has been retained) and others are more or less no-brainers (like the stabilization question: you can enable/disable and even fine-tune that in the dumbed-down iMovie, so why shouldn't you be able to do that in Final Cut?).
ddtlm
Oct 7, 03:53 PM
Backtothemac:
Jesus, you still don't get it. If you compare Apples to Apples, the 1.6GHz dual Athlon is still slower in apps that are multi-processor aware. Now, how about the PIV? How does that stack up? The x86 is garbage. Any real IT director would know that.
No, I "get it" fine. Don't bother testing a 1.6GHz dual Athlon when 1.8GHz dual Athlons are readily available. It would do you good to note that this test did not cover all "apps that are multi processor aware"; it covered only two apps that are multi-processor aware, and on one of them the Mac loses by a lot. Even on its one win, the dual 1.25 G4 would still lose to a top-of-the-line dual Athlon. Which is slower than a top-of-the-line dual Xeon. Get it?
faroZ06
May 2, 06:46 PM
Except this is not a virus. Some of you guys need a course on malware terminology. This is a trojan at best. Spyware at worst. Hardly a virus.
Exactly, everyone always talks about Macs being susceptible to viruses and viruses already existing for Macs, then they give the whole "market share" speech. I'm just sitting here virus- and malware-free laughing :p and most likely will be even if Apple gains market share. I'm a half-time Windows user, and I see soooo many security problems in it, but the MS fanboys blame market share!
I will say that market share DOES up the number of attacks on something, which is why Windows gets attacked so much, but it's also much much easier to attack than Mac OS.
AidenShaw
Oct 7, 07:35 AM
As I've explained in detail above, AV, the 2.33GHz Clovertowns are the most likely candidate, as they cost Apple the same $851 as the 3GHz Woodies. So Apple can give customers a clear choice of a fast 4-core or a slower 8-core for the same +$800, at the same $3,300 total.
The slower Clovertowns also match the Woodie for TDP - you can get more power (for multi-threaded workflows) at the same power consumption (and heat production) with the quad.
GGJstudios
Apr 13, 03:16 PM
I'm sure this has been mentioned.
Connecting other hard drives. I'm only able to read from most (windows) drives.
FAT32 (File Allocation Table) is the answer: most Windows drives are formatted NTFS, which OS X mounts read-only, whereas FAT32 volumes can be both read and written by OS X and Windows alike.
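A quick way to check which case a given drive falls into is to test whether the current user may write to its mount point. Below is a minimal Python sketch; the path /Volumes/ExternalDrive is a hypothetical placeholder for your own volume's name.

import os

# Hypothetical mount point: substitute your own drive's volume name.
VOLUME = "/Volumes/ExternalDrive"

# os.access() with W_OK reports whether the current user may write to the
# path; access(2) fails with EROFS on a read-only file system. An NTFS
# drive mounted read-only by OS X fails this check, while a FAT32 volume
# mounted read/write passes it.
if os.access(VOLUME, os.W_OK):
    print(VOLUME + " is writable (e.g. FAT32)")
else:
    print(VOLUME + " is read-only (likely NTFS)")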
iindigo
May 2, 12:11 PM
Uh huh. And OSX doesn't ask you to manually enter a password every time you install or change something? Windows only asks you to authorize...which is technically more "annoying"?
I don't know about you, but once I have my Mac set up (apps and updates installed) about the only thing I enter my password for is to unlock the screen saver. Maybe for the occasional random app I install or when I need to change an otherwise permissions-locked file. It's not a super common thing and if a password dialog pops up for seemingly no reason it sends up a red flag.
As for which is more obnoxious, I'd have to say UAC by far. As noted previously, the user is prompted with UAC for many things you'd never see a password dialog in OS X or Linux for. This is partly due to a design flaw in Windows: many third-party applications won't even run unless they have administrator access (silly, no?).
I actually don't know anyone who has ever disabled UAC.
Our experiences differ, then. A good half or more of the students at my college have theirs disabled. The reason always cited is, "because it was annoying".
puma1552
Mar 12, 03:43 AM
Oh cr*p. The headline is 'huge explosion'.
I think it's clearly time to start making comparisons with Chernobyl and discussing how widespread the radiation damage is now potentially going to be, rather than praising how Japanese reactors are different to Soviet ones. That huge cloud of smoke is enough to tell anyone, expert or not, that this is already way beyond just getting backup cooling diesel generators operational again; we're witnessing a massive disaster, a genuine bona fide China Syndrome meltdown.
Why is this Chernobyl?
What are the similarities?
What are the differences?
What's your background?
Do you understand why Chernobyl is uninhabitable for several hundred years, while Hiroshima and Nagasaki are thriving, gorgeous cities?
Did you freak out at the "1000x" radiation levels too, like the rest of the western media did who didn't have the remotest clue that it was still magnitudes below the hazardous level? You certainly buy into the "Huge Explosion!!!" headlines, as evidenced by your post, so it's hard to take anything you say seriously.
It's a serious situation, but you are panicking a little too much, with next to zero information.
retroneo
Oct 7, 08:29 PM
For example, every phone manufacturer is going to have their own set of features. Some may have cameras, vibration, video playback, etc. With the iPhone, you know exactly what is there and what the device you're targeting can do. You can build better applications to utilize the specific hardware.
Of the 6 iPhone OS devices released so far (still more than Android), each has its own set of features. Some may have cameras, vibration, video playback, etc. There is also an enormous range of CPU and GPU ability. I think the only consistent thing so far has been the screen size and the fact that apps can only use touch and none of the buttons.
So there is a similar (smaller) problem that exists for developers on iPhone. It's unfortunately why Firemint say they won't release Real Racing 3GS too. Android tries to keep fragmentation to a minimum by running everything in a virtual machine but ultimately it has the same problem.
These aren't game consoles that are released once every 5 years.
camomac
Jul 14, 02:12 PM
ahhh, why didn't they have dual optical slots in the current G5's..
too much heat from the PPC's and all those fans?
Well, I am really looking forward to the new look.
blackstarliner
Sep 21, 03:44 AM
Airport Express and AirTunes allowed streaming content to a stereo. This just adds a video function. That's it. If there is a HD, it's for buffering and basic OS/navigation.
still a very cool solution to sending content
yes, but it also may have the functionality to browse and download content directly... maybe
AidenShaw
Oct 8, 10:23 AM
Faster at what? I'm too lazy to find the part in the keynote where they showed this. Was it 20% faster at something designed to use all 8 cores?
The task was a multi-threaded matrix multiplication that easily scales to multiple cores.
This is representative of many HPC and rendering apps, but not as realistic for most desktop apps (unless, of course, you're like MultiMedia and run several separate instances of desktop apps simultaneously).
The sections in the video are at 11:50 to 15:00, and 26:30 to 28:00. (The gap is while the engineer is swapping CPUs and rebooting.)
My earlier numbers were a bit off - rewatching the video the Woodie system was 40% faster than the Opteron, at 17% less power. The Clovertowns were low-voltage parts "about 900MHz" slower than the Woodies. The octo (dual quads) was about 60% faster than the Opteron at 17% less power. (I'd like to have seen them put in faster Clovertowns, and show what the octo Clovertown would do when matching the power draw of the Opteron.)
At about 25:00 minutes in, Gelsinger says that the "two woodies in one socket" is the "right way to do quad-core at 65nm", due to manufacturing and yield issues.
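For anyone curious what such a benchmark looks like in practice, here is a minimal Python sketch of a multi-process matrix multiplication. It is an illustration of the technique, not Intel's demo code: splitting A by rows gives each worker an independent slice of the product, which is exactly the kind of workload that pegs every core it is given.

import multiprocessing as mp
import time

import numpy as np

N = 1024  # matrix dimension; large enough to keep the workers busy

def multiply_block(args):
    # Multiply one horizontal slice of A by the full matrix B.
    a_block, b = args
    return a_block @ b

def timed_multiply(workers):
    rng = np.random.default_rng(0)
    a = rng.random((N, N))
    b = rng.random((N, N))
    blocks = np.array_split(a, workers)  # one row-slice per worker
    start = time.perf_counter()
    with mp.Pool(workers) as pool:
        parts = pool.map(multiply_block, [(blk, b) for blk in blocks])
    np.vstack(parts)  # reassemble the full product
    return time.perf_counter() - start

if __name__ == "__main__":
    # Caveat: NumPy's BLAS may already thread each multiply, which can
    # blur the per-process scaling; the row-wise split is the point.
    for workers in (1, 2, 4, 8):
        print(f"{workers} worker(s): {timed_multiply(workers):.2f}s")

On an 8-core machine the 8-worker run should finish well ahead of the 1-worker run, which is the scaling behaviour the demo was showing.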
dgree03
Apr 28, 09:30 AM
Let me try to explain what I mean from a different angle:
The number of PCs being sold could remain constant and still fall behind tablet sales in the future. Why? The market expands. Think about who could use a mainframe back in the day. Very few companies. Then minicomputers came along and suddenly many more companies could get one. The market expanded, and even if mainframe sales remained constant, minicomputer sales surpassed them.
Tablets will appeal to those who never got comfortable with PCs. Or who never bothered getting one at all. I've personally seen toddlers and 80-year-olds gravitate toward the iPad naturally. It just fits them perfectly. There's none of that artificial abstraction of a keyboard or mouse between their fingers and the device, they just interact directly. It appeals to them.
Someone who uses a PC almost exclusively for email and web surfing will find a tablet appealing to them.
Programmers and professional writers used to keyboards will not find a tablet appealing to them. Not yet, at least.
So when the market balloons yet again to take in the Tablet Era, PCs will continue to be sold, but the number of users in this new market will be larger than the market that existed in the PC Era. Many PC users will move to tablets, and many folks who never enjoyed (or even used) PCs will grab a tablet. It will be bigger than the PC market by 2020.
And by the way, the price premium referred to earlier in this thread? That's unique to Macs versus PCs because Apple does not compete in the low-end of the market. But in the smart phone and tablet markets, there is NO price premium. One day people will forget that Apple ever made "high-priced" items since it simply won't be true compared with the competition.
As for Apple never making headway, they are merely the most profitable computer company on the planet. Nice lack of headway if you can get it.
Oh, I completely understand what you mean, thanks for the further clarification.
Let's not forget that we are dealing with a more "computer"-savvy generation. Your examples of 80-year-olds and infants are generally correct, but when those infants get to school, they will be using desktops (at school). I think the barrier that existed with the emergence of the PC in the late '80s is still prevalent today, though not with the younger crowd anyway.
I think it will get to the point where people have multiple devices in their homes, just like people have laptops, desktops, and tablets (like myself). They will each have a place, but I just don't think tablets will run desktops and laptops out of people's homes and time in the next 10-15 years.
r1ch4rd
Apr 22, 10:05 PM
In some areas of the US, people look down on you if you admit that you don't believe in God. People can be very vicious about it, and in the workplace it's best not to voice your opinion or the Christians will gang up against you. I've seen this happen several times.
That's a real shame and I hope that improves for you. I am proud that we appear to be more open minded on this side of the pond. I have had plenty of people disagree with me, but we can agree to accept our differences.
I was once pointed to an interesting indication of the difference in culture. In the USA, I believe the $1 bill contains the phrase "In God We Trust". In the UK, we have Charles Darwin on our currency! He appears on the £10 note and a recent £2 coin. The £2 coin changes fairly regularly though.
wdogmedia
Aug 29, 03:52 PM
Even if, which I doubt, your theory of water vapour is correct - that does not give us the excuse to pollute this planet as we see fit. All industry and humans must clean up their act - literally.
Some of what I said was theory, but every factual statement I gave was just that - factual. No climatologist would argue with any of the facts I gave... it's just that, as with statistics, the interpretation of the facts differs.
And no, we have no excuse to pollute the planet....human actions proven to disrupt the environment (deforestation, toxic runoff, killing off animal species, etc.) should be stopped whenever possible. We are responsible for taking care of this planet, but at the same time we have to realize when advancements have been made. Our cars, boats, factories and city skies are infinitely more environmentally-friendly than they used to be, but if 30 years of industrial and personal "clean-up" have done nothing to stem global warming, it's only natural to wonder if maybe it's not us causing the problem.
In other words, if we've streamlined our machinery to be 99% more efficient, is it worth it to spend the billions of dollars to get rid of that last 1% if our original effort has done nothing to the greenhouse effect?
Penfold2711
Apr 21, 07:02 AM
I love the title simply because it reads like its discussing Steve Jobs' involvement in fragmenting Android :D
Maybe that's why Steve has gone missing: he's on a secret mission. I can imagine Steve in dark glasses, a trench coat, and a hat, running around Google HQ with his MacBook Pro as we speak :D
tk421
Apr 13, 12:34 PM
Nobody I know that's a professional editor (as opposed to a hobbyist) is very excited. If I had to sum up the opinions in two sentences, it would be: It looks like a mixed bag. I need to hear more.
My thoughts: On the surface, they seem to have addressed a lot of "problems" that didn't exist for me. At the same time, they did NOT address what I found to be the largest shortcomings: Media Management and Multi-Editor Support. Which leads me to believe that it targets a different audience than me. For example, I didn't see anything that makes it better for feature film use. But a lot of automated stuff (audio processing, color correction, etc.) will make it better for wedding videos or projects with really small budgets.
Some things, like making audio and video merged in a single track, sound like a drawback, not a feature. But I would have to try it out myself. Maybe it'd be good once I got used to the new way of doing things.
There were some things that sounded good. Utilizing multiple cores, 64 bit, background rendering, editing while ingesting, and PluralEyes-like audio syncing. Of course all this depends on how they're implemented. Just like I might actually like merging audio and video, I might end up not liking these things (for example if you can't disable background rendering). One other "feature" I really like is the price, but that's secondary to the actual functionality.
I guess we'll see. I'm interested in hearing more.