Apple to start making their own CPUs, about time

Discussion in 'General Off Topic Forums' started by transmaster, Apr 3, 2018.

  1. transmaster

    transmaster Addicted Member

    Messages:
    7,221
    Location:
    Cheyenne, Wyoming
    Last edited: Apr 3, 2018
    oldgringo likes this.

     


  2. FONSguy

    FONSguy Super Member

    Messages:
    2,032
    Location:
    Sterling, VA
    I thought they were kinda doing that with the 68030 chips back in the day.
     
  3. quaddriver

    quaddriver 120 What's per channel Subscriber

    that was a 'rola chip. it was in apples, ataris, commodores, suns etc.

    is apple leasing a design or making their own? it's not child's play, and it requires a few mil for the machine just to set up the masks...apple is still a computer company? ;-)
     
  4. buglegirl

    buglegirl In The Direction Of The Singularity Subscriber

    Messages:
    11,475
    Location:
    Mid Atlantic
    Hello,

    From what I understand it's an ARM type of chip, much like they use in phones and tablets now.

    Hard for me to imagine how they are going to get Mac Pro performance out of it.

    And I do worry that this is a way to invalidate all Intel-based Macs, much like they pulled the rug out from under the PowerPC machines back in 2005.

    Mind you, I get the move away from Intel. Just maybe they should have bought AMD instead.

    Frannie
     
    botrytis likes this.
  5. transmaster

    transmaster Addicted Member

    Messages:
    7,221
    Location:
    Cheyenne, Wyoming
    Apple is already building the “A” series CPUs that are in iOS devices; my 7+ has an Apple A10 Fusion, a quad-core 2.34 GHz, 64-bit, 16 nm CPU. It is based on the ARM architecture but is designed by Apple. Apple has been moving in this direction for a number of years.
     
  6. buglegirl

    buglegirl In The Direction Of The Singularity Subscriber

    Messages:
    11,475
    Location:
    Mid Atlantic
    Still, I would not render complex video or 3D images on an iOS device.

    They are going to have to really step it up with the ARM platform to make this viable.

    Frannie
     
    botrytis likes this.
  7. transmaster

    transmaster Addicted Member

    Messages:
    7,221
    Location:
    Cheyenne, Wyoming
    When Apple dropped the Motorola 68030 and later the PowerPC chip, those of us who were long-standing fans of AMD processors were very disappointed that Apple went with Intel.
     
  8. transmaster

    transmaster Addicted Member

    Messages:
    7,221
    Location:
    Cheyenne, Wyoming
    You should see what the current iPad Pro and iPhone do with their PowerVR GPUs. As for the Macs, they will continue to use Radeon GPUs.
     
    Last edited: Apr 3, 2018
  9. botrytis

    botrytis Trying not to be a Small Speaker Hoarder Subscriber

    Sorry, ARM is not a very useful CPU and the PowerVR GPU is a joke. 3D graphics - nope. Run a VR headset - nope. An ARM processor is not advanced enough to do what the regular Intel CPUs are doing for Apple. I expect them to drop OS X and just have iOS for all.

    I doubt Apple will continue with the Radeon GPUs. They want to control everything in their space, so they will also start to make mediocre GPUs.
     
    Bill Ferris and jami w. like this.

     


  10. transmaster

    transmaster Addicted Member

    Messages:
    7,221
    Location:
    Cheyenne, Wyoming
    This is what the current iPad Pro can do. I don’t think you need to wonder about what an iPad can do. If Apple can come up with something better than the Radeon or the Nvidia, great.

     
    Last edited: Apr 3, 2018
  11. botrytis

    botrytis Trying not to be a Small Speaker Hoarder Subscriber

    And? If that is all - sorry colour me not impressed.
     
    Bill Ferris and jami w. like this.
  12. transmaster

    transmaster Addicted Member

    Messages:
    7,221
    Location:
    Cheyenne, Wyoming
    I get the idea that Apple could present alien technology from the planet Mongo, courtesy of Dr. Hans Zarkov, and you still would not be impressed. I get it.
     
    Cadillac Kid and beat_truck like this.
  13. hatrack71

    hatrack71 distracted by everything

    Messages:
    5,300
    Location:
    Helena, Montana
    You really think they'll drop the Xeon iMac altogether? The new 27" is pretty sweet. I don't see anything as viable as Intel in chipsets for the type of server-grade machines Apple has been turning out. AMD is just as vulnerable security-wise, so I'd see no advantage there. I always thought the 2006-2012 Mac Pro Xeons and the iMacs equipped with the i5/i7 were the best! That's actually when I started liking Apple again. Or it could be the aluminum. Guess they'll become collector's items now. Maybe Apple wants to go fully mobile and quit making desktops. I'm actually really bummed, as I hadn't heard this 'til now. :^(
     
    Last edited: Apr 3, 2018
  14. botrytis

    botrytis Trying not to be a Small Speaker Hoarder Subscriber

    No - it is just that rather than being innovators they are copiers of technology, much like the Japanese after WWII. Apple has not invented anything of note at all. They just copy, tweak, and remarket for more profit.

    You like them, I get it. All I see is a closed system that is just getting even more closed.
     
  15. buglegirl

    buglegirl In The Direction Of The Singularity Subscriber

    Messages:
    11,475
    Location:
    Mid Atlantic
    ^^^^ This!!!

    And that makes perfect sense to me, as that really is where the world is going. I could see a very near future of VR, mobile VR, and specialized computing devices for everything else.

    And as far as Apple not inventing anything at all?? That is kinda harsh.

    Please, other than UNIX and most of what was developed during the mainframe and minicomputer era, is anything "New And Unique" by anyone??

    And don't tell me Microsoft Windows or PCs.

    Frannie
     
  16. botrytis

    botrytis Trying not to be a Small Speaker Hoarder Subscriber

    Not saying that either Frannie. You know the old saying, 'What is old is new again'.
     
  17. quaddriver

    quaddriver 120 What's per channel Subscriber

    some truth in that.

    back in the glory daze at IBM, our chief architect (for our os, not IBM's) had put forth a paper that of the tiers - T1 = mainframe, T2 = minis, T3 = workstation - T3 had become so advanced that T2 was essentially moving to the workstation.

    That was borne out in the 2000s and '10s, and in the last few years the original T3 users - end users - you all - have moved to the phone. PC and laptop sales are in the toilet US-wide, yet phones...yee haw (and phone-like things like tablets and stuff).

    not totally unlike the AK theory that to most people a stereo is an ipod and earbuds...to most a computer is the phone. try typing a term paper on it....oh wait, schools dont do that anymore. makes our kids seem intelligent. cant have that.

    ps frannie, one of my senior-year theses was that there would be no new OS. Im not talking rebranding a unix kernel and calling it android or whatever - a complete new OS with new memory models and dispatching etc. I wrote that 31 years ago. it seems that 'whats it REALLY doing?' is no longer an important or interesting question.
     
    Bill Ferris likes this.

     


  18. transmaster

    transmaster Addicted Member

    Messages:
    7,221
    Location:
    Cheyenne, Wyoming
    I have been into computers since about 1970. I have seen just about everything. I had my first Apple computer in 1984, but it didn't have the engineering software I needed, so I moved on to computers that did. Quaddriver, I never really thought about it, but OSes have not changed much in the last 30 years or so; the UI has been vastly improved, but even that has stopped changing. The next thing we are moving into is VR. Really, the future of computing can be found in the anime series Ghost in the Shell.

    One of the memories I have from my early computing days was a graduate student bitching and moaning that it was taking the computer, an IBM mainframe at the University of Wyoming, 3 whole hours to do some mathematical problem. A mathematics professor, a lady, turned on this clown and chewed him out, pointing out that the problem he was complaining about would have taken several months if solved with slide rules.
     
  19. Pete B

    Pete B AK Member Subscriber

    Messages:
    1,625
    Location:
    CT, USA
    I worked in CPU design, and one of the golden rules is to maintain binary compatibility with each new generation. IBM, DEC, and most others did this. Apple did not when they jumped from 68K to PowerPC, and then again to x86. Now they will do it again?
    I wish someone would resurrect the DEC Alpha chip.
     
  20. transmaster

    transmaster Addicted Member

    Messages:
    7,221
    Location:
    Cheyenne, Wyoming
    I always had mixed emotions about Apple dropping the PowerPC chip; it sort of made Apple unique, but by the time they dropped it the PowerPC's actual benchmarks were falling behind both Intel and AMD. What was a disappointment to a lot of us was going to Intel instead of AMD. Now I understand why they didn't: AMD did not have the production capacity of Intel at the time. The CPU will be a version of the ARM-based "A" family, if rumors are to be believed. As for backward compatibility, Apple has a history of going forward and not making the same mistake Windows has with legacy code.
     

Share This Page