

  Currently browsing posts tagged with: Motorola

    Newsletter Issue #1018: Random Thoughts on the Upcoming Apple Silicon Macs

    May 25th, 2021

    As you might expect, the skeptics are looking hard to find problems with Apple’s first-generation Macs with the M1 chip. They need something to do, but aside from apps whose developers haven’t yet updated them for the new silicon, and a few glitches here and there, the rollout has been quite seamless. What’s more, strong Mac sales clearly indicate customers are pleased, or at least that the changes aren’t impediments to buying new gear.

    Now I’m sure most people who purchase new Macs aren’t concerned so much about the fine details of a new processor architecture. That’s for us geeks, who get involved in the nuts and bolts of Apple’s design choices.

    For the first release of the M1 Mac mini, MacBook Air and 13-inch MacBook Pro, Apple followed the same tack used in the transitions from Motorola to PowerPC and from PowerPC to Intel. The external designs were virtually identical to the models they replaced except for the new hardware. As a practical matter, almost everything you did to make the new Macs run was the same as on the older models. The 24-inch iMac represents the first change, to a thinner, lighter form factor — and they come in colors, which makes it sort of a throwback to the second-generation iMacs from over 20 years ago.




    Newsletter Issue #987: Apple and the Failing Upgrade Argument

    April 17th, 2020

    The best way for me to put this in perspective is to turn back the hands of time to my early days as the owner of a personal computer, in the mid-1980s. In those days, I was in a position to upgrade frequently, even though it was fair to say that I could put off some of those purchases for a while without my workflow suffering.

    Really, the main improvement for me was the upgrade from a 14-inch Apple color display to a 19-inch something-or-other. As an historical aside, that original 14-inch display soon became a 13-inch display because of a revision in the way display size was calculated.

    In any case, I took advantage of my status as a tech journalist to upgrade Macs every year or so. In large part, the performance improvement was more or less worth it; that is, until the Power Macs arrived in 1994. With a high-performance RISC processor from IBM and Motorola, those Macs held the promise of amazing performance, but it didn’t quite work out that way.




    Revisiting Mac on ARM

    April 6th, 2018

    I have lived through all the major Mac processor transitions. Makes me feel old. First it was the Motorola 680x0 series, followed by the PowerPC and, by 2006, Intel.

    Overall, the last one went pretty well. There was a way to run PowerPC software for a few years, courtesy of something called Rosetta. It was pretty decent from a performance standpoint, unlike the 680x0 emulator, which effectively put you a couple of generations behind in performance until apps went native on PowerPC. But until the new apps arrived, the all-new RISC architecture didn’t seem so impressive.

    So is Apple planning yet another processor switchover? Well, consider how Apple has managed to deliver its A-series processors with huge performance boosts every year, very noticeable with most apps.

    Compare that to new Intel processor families that might be measurably more powerful than the previous generation, but whose performance advantages are often barely noticeable without a scorecard. Apple’s edge was to create an ARM-based processor family tailored directly to iOS. It wasn’t bogged down with legacy support for things that never existed on an Apple platform, making for more efficiency.

    So does Apple have a Mac on ARM in its future? Microsoft tried Windows RT (on ARM) without a whole lot of success, but perhaps its second try will fare better.

    Using Apple’s Xcode, it shouldn’t be such a big deal for developers to make the transition to ARM, building fat binaries that run on both ARM and Intel. Recent rumors have it that you’ll be able to run iOS apps on Macs, and vice versa, more or less. The Touch Bar on the latest MacBook Pros runs on a second processor in that computer, an A-series system-on-a-chip. A similar scheme is used for low-level functions on the iMac Pro.
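    The fat-binary mechanism is nothing new; it’s how Apple bridged the PowerPC-to-Intel move. As a sketch of how a two-architecture build might look from the command line on a Mac with Xcode’s tools installed — the arm64 Mac slice is the speculative part here; multiple `-arch` flags and `lipo` are long-standing parts of Apple’s toolchain:

    ```shell
    # Create a trivial C program to build.
    cat > hello.c <<'EOF'
    #include <stdio.h>
    int main(void) { printf("hello\n"); return 0; }
    EOF

    # clang accepts multiple -arch flags and merges the slices into one
    # Mach-O "fat" (universal) binary automatically...
    clang -arch x86_64 -arch arm64 -o hello hello.c

    # ...or build each slice separately and glue them together with lipo:
    clang -arch x86_64 -o hello_x86 hello.c
    clang -arch arm64  -o hello_arm hello.c
    lipo -create hello_x86 hello_arm -output hello_fat

    # Inspect which architecture slices a binary carries:
    lipo -info hello_fat
    ```

    The operating system picks the matching slice at launch time, which is why users never had to care whether an app shipped as PowerPC, Intel or both.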

    So Apple is clearly taking you partway already. How long will a full shift take, and should you take such a possibility seriously?

    It’s a romantic ideal: Apple having full control of more and more of the parts that make up its hardware. It would also allow the Mac to be far more than just a higher-priced PC in a fancy box.

    According to recent reports from Bloomberg’s Mark Gurman, the prospective shift may happen beginning in 2020. Take it with a grain of salt for now.

    But can an iPhone or iPad chip really power a Mac with equal or better performance than current models? Consider the benchmarks that show Apple’s mobile hardware exceeding the performance of most notebook PCs and coming close to the MacBook Pro. No doubt those CPUs are throttled back to reduce the drain on resources and battery life. What would those benchmarks be if Apple allowed them to run full bore?

    What about the chips shipping two years from now? Remember, too, Apple already has control of graphics hardware, so what happens to its existing partners, AMD and NVIDIA? Apple probably wouldn’t care if it took these steps.

    It wasn’t so easy for Apple to persuade developers to adopt PowerPC, but it was far easier to go to Intel, since there was so much legacy software on the Windows platform. That meant that many developers already knew how to optimize their Mac apps for Intel. As I recall, it wasn’t such a difficult move.

    But there was one key advantage of Apple going Intel, other than being assured of regular improvements, more or less, in the chips. It was the ability to run Windows natively with Boot Camp, and at pretty good speed with virtual machines courtesy of such apps as Parallels Desktop.

    If Boot Camp and virtual machines have to run in emulation on one of these newfangled Macs, how much would performance deteriorate? Or would Apple devise ways to work around this, such as licensing some Intel chip functions or using the graphics hardware to reduce the performance bottleneck? I would be loath to predict how it could be done, but if the ARM chips end up significantly faster than their Intel counterparts, maybe most people won’t notice much of a difference.

    It wouldn’t take the infamous performance hit of running Windows under emulation on the PowerPC. That was just dreadful. I remember that opening a document would often take a full minute or two.

    Some suggest that Apple, which has often ditched older technologies without apology, might just give up on the concept of running Windows on a Mac. But I suspect lots of users still need that feature. I also suspect that Apple is quite capable of devising a solution that wouldn’t hurt performance in any particularly noticeable way.

    But this all needs a reality check. That Apple could make this change doesn’t mean it will. It might very well be that Intel’s existing hardware roadmap is a viable solution, without saddling Apple with the development costs of a new processor transition. But there are also good reasons for consistent hardware across its major platforms. If the annual improvements in Apple’s A-series CPUs continue to provide healthy double-digit performance boosts, maybe it will happen after all.

    I’m skeptical, but with Apple, never say never, particularly if Intel confronts any serious headwinds in improving its chips going forward.



    The CPU Bug: macOS and iOS Users Get Off Easy

    January 10th, 2018

    As Apple patches serious long-standing CPU bugs on its iOS, tvOS and macOS gear, it claims you shouldn’t notice any performance dip. Well, maybe a little in Safari, where, after installing the Spectre fix, Apple reported that one web benchmark was reduced by 2.5%. But you can get on with your life as Apple continues to “mitigate” the problem.

    But there have been concerns, because Intel claimed that the impact would be in the range of 5-30%. So if it’s not so much on Apple’s platforms, what about the rest of the computing world?

    Well, according to Microsoft, PC users with older processors and operating systems will pay the price and see lower performance.

    Microsoft claims that people running Windows 10 on PCs with 2016 or later Intel hardware should only see slowdowns in the single digits. But that means that people with older gear may suffer far worse results. Microsoft has yet to publish detailed benchmarks, but tech sites have already begun to run their own.

    I did catch a set of benchmarks at one site, but it appeared to involve recent or current hardware, and the impact was either insignificant or in the range of 1-4%. Another set of tests at a second site yielded similar results across a battery of benchmarks. Again they involved recent hardware, and that appears to confirm Microsoft’s conclusion that newer PCs would suffer minor losses, probably not noticeable under normal use. But I’d like to see what happens with older hardware, the PCs you’d normally see at many businesses.

    Cloud systems may exhibit a worse impact, however. According to Epic Games, its servers received Meltdown patches and suffered a 20% increase in CPU utilization as a result. The Linux server we use for these sites was patched recently and restarted today. I have yet to observe any increase in server load, and performance of our sites appears normal. But the system is usually under especially heavy load on Sundays and Mondays, after my two radio shows are posted. I’ll see what happens then.

    Obviously, if you’re using Microsoft Office, checking email or using a browser, even a more significant reduction in overall CPU performance may go largely unnoticed. The main impact will come when the hardware is pressed to work harder.

    Unfortunately, PCs running older-generation AMD CPUs may be essentially bricked after the Windows patch for the Spectre bug is installed. Last I heard, AMD was working with Microsoft to fix the problem, but that means an unknown number of computers were left unusable. When these machines, running Windows 7 or Windows 10, were started up, they’d stall at the startup logo.

    I suppose you could say that Apple got off pretty much scot-free on this one, except that more fixes are expected, and it’s possible any of them might have a negative impact. It’s not as if Apple hasn’t suffered from flawed updates.

    In the meantime, it appears that Google and its Android vendors, such as LG, Motorola and Samsung, are working to patch their products. Recent Nexus and Pixel smartphones may have already been updated, or will be shortly.

    Regardless, you won’t just suffer from Meltdown and Spectre out of nowhere. Something has to put malware on your system that’s been programmed to take advantage of these flaws. Since the bugs went undiscovered for over 20 years, any exploits would have to be of recent vintage.

    So it goes back to practicing safe computing. Unless you jailbreak your iOS device, there shouldn’t be anything to worry about. Mac users need to download software mainly from well-known sites, or the Mac App Store. Clicking on unknown email links or visiting sites that are off the beaten track could leave you vulnerable to some sort of intrusion, but that’s always been true.

    In any case, I’m not altogether certain how this all remained undiscovered for so long. One of my colleagues suggested the use of sandboxing in recent years created the climate where these bugs could be exploited. But you’d think hackers and security researchers have been regularly kicking the tires of operating systems and hardware for years in search of undiscovered flaws.

    Why so long?

    I don’t want to fear-monger, but if such serious bugs could exist for two decades before being found, just what else is lurking deep within the recesses of those millions of transistors that make up modern computing chips? Will other defects turn up next? Obviously, keeping abreast of security flaws is a 24/7 job, and there are always discoveries and fixes posted by Apple and other companies.

    What it goes to show is that no computer is perfect. There are always going to be glitches along the way. Some will impact performance, or cause unstable behavior. Others will, as you’ve seen, open up the hardware or software to security exploits of one sort or another.

    As new operating system features are tested, they might reveal other previously undiscovered flaws. At the same time, perhaps this episode will serve as an object lesson for Intel, AMD, ARM and other hardware makers to redouble their efforts to make the products installed in billions of homes and businesses as safe as possible.
