Apple to start making their own CPUs, about time

I thought they were kinda doing that with the 68030 chips back in the day.
that was a 'rola chip. it was in apples, ataris, commodores, suns, etc.

is apple licensing a design or making their own? it's not child's play, and it requires a few mil for the machine just to set up the masks...apple is still a computer company? ;-)
 
Hello,

From what I understand it's an ARM type of chip, much like they use in phones and tablets now.

Hard for me to imagine how they are going to get Mac Pro performance out of it.

And I do worry that this is a way to invalidate all Intel-based Macs, much like they pulled the rug out from under the PowerPC machines back in 2005.

Mind you, I get why they're moving away from Intel. Just maybe they should have bought AMD instead.

Frannie
 
Apple is already building the “A” series CPUs that are in iOS devices; my 7+ has an Apple A10 Fusion. It is a quad-core 2.34 GHz 64-bit 16 nm CPU. It is based on the ARM architecture but is manufactured by Apple. Apple has been moving in this direction for a number of years.
 
Still, I would not render complex video or 3D images on an iOS device.

They are going to have to really step it up with the ARM platform to make this viable.

Frannie
 
Mind you, I get why they're moving away from Intel. Just maybe they should have bought AMD instead.

Frannie

When Apple dropped the Motorola 68030 and later the PowerPC chip, those of us who were long-standing fans of AMD processors were very disappointed when Apple went with Intel.
 
Still, I would not render complex video or 3D images on an iOS device.

They are going to have to really step it up with the ARM platform to make this viable.

Frannie
You should see what the current iPad Pro and iPhone do with their PowerVR GPUs. As for the Macs, they will continue to use the Radeon GPUs.
 
This is what the current iPad Pro can do. I don’t think you need to wonder about what an iPad can do. If Apple can come up with something better than the Radeon or the Nvidia, great.

 
Still, I would not render complex video or 3D images on an iOS device.

They are going to have to really step it up with the ARM platform to make this viable.

Frannie

You really think they'll drop the Xeon iMac altogether? The new 27" is pretty sweet. I don't see anything as viable as Intel in chipsets for the type of server-grade machines Apple has been turning out. AMD is just as vulnerable security-wise, so I'd see no advantage there. I always thought the 2006-2012 Mac Pro Xeons and the iMacs equipped with the i5/i7 were the best! That's actually when I started liking Apple again. Or it could be the aluminum. Guess they'll become collector's items now. Maybe Apple wants to go fully mobile and quit making desktops. I'm actually really bummed, as I hadn't heard this 'til now. :^(
 
Maybe Apple wants to go fully mobile and quit making desktops.
^^^^ This!!!

And that makes perfect sense to me, as that really is where the world is going. I could see a very near future of VR, mobile VR, and specialized computing devices for everything else.

And as far as Apple not inventing anything at all? That is kinda harsh.

Please, other than UNIX and most of what was developed during the mainframe and minicomputer era, what is "new and unique" from anyone?

And don't tell me Microsoft Windows or PCs.

Frannie
 

some truth in that.

back in the glory daze at IBM, our chief architect (for our OS, not IBM's) had put forth a paper that, of the tiers (T1 = mainframe, T2 = minis, T3 = workstation), T3 had become so advanced that T2 was essentially moving to the workstation.

That was borne out in the 2000s and '10s, and in the last few years the original T3 users - end users - you all - have moved to the phone. PC and laptop sales are in the toilet US-wide, yet phones...yee haw (and phone-like things like tablets and stuff).

not totally unlike the AK theory that to most people a stereo is an iPod and earbuds...to most, a computer is the phone. try typing a term paper on it....oh wait, schools don't do that anymore. makes our kids seem intelligent. can't have that.

ps frannie, one of my senior-year theses was that there would be no new OS. I'm not talking about rebranding a unix kernel and calling it android or whatever, but a completely new OS with new memory models, dispatching, etc. I wrote that 31 years ago. it seems that 'what's it REALLY doing?' is no longer an important or interesting question.
 
I have been into computers since about 1970. I have seen just about everything. I had my first Apple computer in 1984, but it didn't have the engineering software I needed, so I moved on to computers that did. Quaddriver, I never really thought about it, but OSes have not changed much in the last 30 years or so; the UI has been vastly improved, but even that has stopped changing. The next thing we are moving into is VR. Really, the future of computing can be found in the anime series Ghost in the Shell.

One of the memories I have from my early computer days was a graduate student who was bitching and moaning that it was taking the computer, an IBM mainframe at the University of Wyoming, 3 whole hours to do some mathematical problem. A mathematics professor, a lady, turned on this clown and chewed him out, pointing out that the problem he was complaining about would have taken several months if solved with slide rules.
 
I worked in CPU design, and one of the golden rules is to maintain binary compatibility with each new generation. IBM, DEC, and most others did this. Apple did not when they jumped from 68K to PowerPC, and then again to x86. Now they will do it again? I wish someone would resurrect the DEC Alpha chip.
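For what it's worth, the way those architecture jumps get papered over on the software side is the "fat"/universal binary: one executable file carrying a machine-code slice per architecture, with the loader picking the matching slice at launch. A minimal sketch of the idea in C - the file name and build commands below are just illustrative, assuming a stock Mac toolchain with both slices available:

/* arch_report.c - illustrative only: report which architecture slice is
 * actually running, using predefined compiler macros.
 *
 * A two-slice universal binary would typically be built with something like:
 *   clang -arch x86_64 -arch arm64 -o arch_report arch_report.c
 * and inspected with:
 *   lipo -info arch_report
 * (commands assume a stock macOS toolchain)
 */
#include <stdio.h>

int main(void) {
#if defined(__x86_64__)
    printf("Running the x86_64 (Intel) slice\n");
#elif defined(__arm64__) || defined(__aarch64__)
    printf("Running the arm64 (ARM) slice\n");
#elif defined(__ppc__) || defined(__ppc64__)
    printf("Running the PowerPC slice\n");
#else
    printf("Running some other architecture\n");
#endif
    return 0;
}

Apple shipped fat binaries for the 68K-to-PowerPC move and universal binaries for PowerPC-to-Intel, so something along these lines would be the obvious play for an ARM switch too.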
 
I have always had mixed emotions about Apple dropping the PowerPC chip; it sort of made Apple unique, but by the time they dropped it, the PowerPC's actual benchmarks were falling behind both Intel and AMD. What was a disappointment to a lot of us was going to Intel instead of AMD. Now I understand why they didn't: AMD did not have the production capacity of Intel at the time. The new CPU will be a version of the ARM-based "A" family, if rumors are to be believed. As for backward compatibility, Apple has a history of going forward and not making the same mistake that Windows has with legacy code.
 
One of the memories I have from my early computer days was a graduate student who was bitching and moaning that it was taking the computer, an IBM mainframe at the University of Wyoming, 3 whole hours to do some mathematical problem. A mathematics professor, a lady, turned on this clown and chewed him out, pointing out that the problem he was complaining about would have taken several months if solved with slide rules.

I was taking some numerical analysis course senior year of college (that was fun - not), and what we were doing was showing just that: how some computations of classic theories take a long time to do (this was 1987, mind you).

I forget the problem du jour, but we had access to the NSF's Cray X-MP/48 in Pittsburgh. We could use any language we desired, and we had to solve it on AT&T 6030 PCs (which were an 8 MHz 8086 with a math coprocessor), a VAX 8600 series, and the Cray.

So OK, the 6030s ran all night; you had to kick it off, let it chew, and hope it didn't crash overnight.

The VAX took about an hour (both of these were in Pascal, Turbo 3.0 on the 6030 under DOS and whatever was in vogue on the VAX).

Then on the Cray: it was a batch machine, so you built the program, debugged it, and sent the executable, the data file, and an output file to write data into, all as a package to the Cray. When the VAX sent it over, you got 3 messages indicating transmission...vax to cray.... vax to cray....vax to cray, like that for each file.

Upon completion, you got cray to vax (the program back, with notes on where it had optimized it), cray to vax (the data file back, touched only with an indication of how far it got into it), and lastly cray to vax, the output file with either errors or the answer.

So I kicked it off, and on my terminal I saw this:
vax to cray....
vax to cray....
vax to cray....
cray to vax....
cray to vax....
cray to vax....

all within 20 seconds.

it impressed me at the time....
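For the curious, the general shape of that kind of exercise is just a tight numerical kernel wrapped in a timer. A toy example in C - the trapezoid-rule integrand and the iteration count here are entirely made up, not the original coursework:

/* bench.c - toy illustration only: time a simple numerical kernel the way
 * you might when comparing machines. The integrand and iteration count are
 * arbitrary; this is NOT the original assignment.
 */
#include <stdio.h>
#include <math.h>
#include <time.h>

/* trapezoid-rule estimate of the integral of sin(x) from 0 to pi */
static double trapezoid(long n) {
    const double a = 0.0, b = 3.14159265358979323846;
    double h = (b - a) / (double)n;
    double sum = 0.5 * (sin(a) + sin(b));
    for (long i = 1; i < n; i++)
        sum += sin(a + (double)i * h);
    return sum * h;
}

int main(void) {
    clock_t start = clock();
    double result = trapezoid(50 * 1000 * 1000L);   /* 50 million slices */
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
    printf("integral = %.9f (expected 2.0), took %.2f s of CPU time\n",
           result, secs);
    return 0;
}

Same program, wildly different wall-clock times depending on what iron you run it on - which was the whole point of the exercise.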
 
As for backward compatibility, Apple has a history of going forward and not making the same mistake that Windows has with legacy code.

The adoption rate and viability of Windows is much greater for a reason, you know.

I get your point, but if I were a large business I would NEVER recommend Apple products, because of this constant upheaval, however well intentioned.

Again, I get your point to a degree, but with this Apple is already invalidating most of the desktop market they have. Who would buy an Apple desktop knowing that it's going to be an orphan in a year or so?

And don't come back with "Apple will make sure these are updated." Yes, updated, but dead as far as new features, which will be the carrot to lure people to pay to replace current hardware.

It's smart as long as Windows keeps dundering along. But I gotta figure there are some happy folks in Redmond over this.

Frannie
 
I have always had mixed emotions about Apple dropping the PowerPC chip; it sort of made Apple unique, but by the time they dropped it, the PowerPC's actual benchmarks were falling behind both Intel and AMD. What was a disappointment to a lot of us was going to Intel instead of AMD. Now I understand why they didn't: AMD did not have the production capacity of Intel at the time. The new CPU will be a version of the ARM-based "A" family, if rumors are to be believed. As for backward compatibility, Apple has a history of going forward and not making the same mistake that Windows has with legacy code.

well hold on a second, backwards compatibility is EXTREMELY important for end users who have a substantial software investment.

Granted, at some point you have to move on (there did come a point where 8086 stuff would no longer run native on 80x86 machines and 24-bit addresses had to be shelved).

but it would be disastrous to the industry as a whole if a new version of the program had to be purchased with each new machine. back in the day, actually until a few years ago, you bought software and it was yours to keep; in business environments, OS software was billed using a GML (graduated monthly license). nowadays we rent software, which costs you more of course, but it has an upgrade path because each mandatory release contains a data migration tool.

data format among the versions is the biggest bugaboo, and there is a LOT of industry code behind the scenes.
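To put a toy example on that (the record layout, field names, and default below are invented, not from any real product): a migration step is basically reading the old on-disk layout and filling in sane defaults for whatever the new layout added.

/* migrate.c - a toy sketch of what a "data migration tool" does between
 * file-format versions. The struct layouts and field names are invented
 * for illustration; real migration code also has to handle endianness,
 * partial reads, and many intermediate versions.
 */
#include <stdio.h>
#include <string.h>

/* version 1 of an on-disk record */
struct record_v1 {
    int  version;            /* always 1 */
    char name[32];
};

/* version 2 added a field; old files must still load */
struct record_v2 {
    int  version;            /* always 2 */
    char name[32];
    int  color_depth;        /* new in v2 */
};

/* upgrade a v1 record to the current (v2) layout */
static struct record_v2 migrate_v1_to_v2(const struct record_v1 *old) {
    struct record_v2 r;
    r.version = 2;
    memcpy(r.name, old->name, sizeof r.name);
    r.color_depth = 24;      /* default for data the old format never stored */
    return r;
}

int main(void) {
    struct record_v1 legacy = { 1, "old project file" };
    struct record_v2 current = migrate_v1_to_v2(&legacy);
    printf("migrated '%s' to format v%d (color depth %d)\n",
           current.name, current.version, current.color_depth);
    return 0;
}

Multiply that by every record type, every skipped version, and every endianness and locale wrinkle, and you get the LOT of industry code mentioned above.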

apple feels entitled to change not just versions but architectures. that is the sole reason why those in the industry do not consider apple a real company with real machines. we refer to them backhandedly as iJunk (and there are less-than-complimentary terms that echo off the walls about the users as well...but yanno)

interestingly, if you walk thru a goodwill store now, you will see DOZENS of docks...why? apple changed the end plug used to dock portable iJunk, and people who bought these things from various manufacturers, and who went thru the mandatory phone upgrade, now have $100 of worthless electronics they cannot sell.

Why people keep patronizing them is beyond comprehension. (I think there is a joke called 'if apple built airplanes....' and the punch line is that not only would a bunch of stuff change, but with each version you would also have to build new airports.)
 
As far as Apple "making" new chips, have they actually built or purchased a modern wafer fab? I know they bought an old Maxim fab some years back, but that's not going to produce modern CPUs. I can see that they might have a design group that outsources fab to somebody like Samsung. Given Apple's pricing structure, I don't see where outsourcing their design for fab vs. using established CPUs is going to be that much of an advantage to them. But what goes on at Apple often eludes me - since 1983.
 