
    There’s Yet Another Rant About Apple and Mac Users

    June 11th, 2018

    Over the years, some tech pundits have decided that Apple really needs to drop the Mac. To them, it has outlived its usefulness and, besides, far more money is made from selling iPhones.

    But it’s a good source of hit bait to claim that “Mac users don’t really matter to Apple.”

    Indeed, Apple has, at times, made it seem as if that claim was accurate. The Mac mini has not been refreshed since 2014. After releasing a total redesign for the Mac Pro in late 2013, Apple appeared to drop the ball and mostly abandoned that model.

    When a new MacBook Pro was launched in late 2016, some thought the claim that it was a professional notebook was a huge exaggeration. It was thinner, in the spirit of recent Apple gear, but the highly touted Touch Bar, powered by an ARM system-on-a-chip, was thought to be fluff and not much else.

    Apple also got dinged for things it didn’t do, such as failing to offer a model with 32GB of RAM. But that would have required using a different memory controller that might have impacted performance and battery life. By comparison, most PC notebooks were also limited to 16GB. A future Intel CPU update will offer an integrated memory controller that doubles memory capacity.

    Just after Christmas, a Consumer Reports review failed to recommend the 2016 MacBook Pro supposedly due to inconsistent battery life. After Apple got involved, it turned out that CR’s peculiar testing scheme, which involves disabling the browser cache, triggered a rare bug. After Apple fixed it, a retest earned the MacBook Pro an unqualified recommendation.

    Was all this proof that Apple just didn’t care about Macs?

    Well, it’s a sure thing the Touch Bar wasn’t cheap to develop, and embedding an ARM chip in a Mac is definitely innovative. But Apple’s priorities appeared to have gone askew, as the company admitted during a small press roundtable in early 2017.

    The executive team made apologies for taking the Mac Pro in the wrong direction, and promised that a new model with modular capabilities was under development, but it wouldn’t ship right away. There would, however, be a new version of the iMac with professional capabilities. VP Philip Schiller spoke briefly about loving the Mac mini, but quickly changed the subject.

    Before the 2017 WWDC, I thought that Apple would merely offer more professional parts for customized 27-inch 5K iMacs. But such components as Intel Xeon-W CPUs and ECC memory would exceed that model’s thermal limits. So Apple extensively redesigned the cooling system to support workstation-grade parts.

    The 2017 iMac Pro costs $4,999 and up, making it the most expensive, and most powerful, iMac ever. Only the RAM can be upgraded, and it’s a dealer-only installation since it requires taking the unit completely apart, unlike the regular large iMac, where memory upgrades are a snap.

    Apple promised that a new Mac Pro, which would meet the requirements of pros who want a box that’s easy to configure and upgrade, would appear in 2019, so maybe it’ll be demonstrated at a fall event where new Macs are expected.

    But Apple surely wouldn’t have made the commitment to expensive Macs if it didn’t take the platform — and Mac users — seriously. The iMac Pro itself represents a significant development in all-in-one personal computers.

    Don’t forget that the Mac, while dwarfed by the iPhone, still represents a major business for Apple. Mac market share is at its highest level in years in a declining PC market, serving tens of millions of loyal users. When you want to develop an app for iOS, tvOS or watchOS, it has to be done on a Mac. That isn’t going to change. In addition, Apple is porting several iOS apps to macOS Mojave, and developers will have the tools to do the same next year.

    According to software head Craig Federighi, iOS and macOS won’t merge and the Mac will not support touchscreens.

    Sure, the Mac may play second fiddle to the iPhone, but that doesn’t diminish the company’s commitment to the platform. But it’s still easy for fear-mongering tech pundits to say otherwise, perhaps indirectly suggesting you shouldn’t buy a Mac because it will never be upgraded, or that upgrades will be half-hearted.

    Perhaps there’s an ulterior motive behind some of those complaints; they are designed to discourage people from buying Macs and push them towards the latest PC boxes that, by and large, look the same as the previous PC boxes with some upgraded parts.

    But since Intel has run late with recent CPU upgrades, Apple has often been forced to wait for the right components before refreshing Macs. That doesn’t excuse the way the Mac mini and the MacBook Air have been ignored, but I’ll cut Apple some slack with the Mac Pro, since a major update has been promised for next year.

    Now this doesn’t mean the Mac isn’t going to undergo major changes in the coming years. Maybe Apple is becoming disgusted with Intel’s growing problems in upgrading its CPUs, and will move to ARM. Maybe not. But that’s then, this is now.



    Revisiting Mac on ARM

    April 6th, 2018

    I have lived through all the major Mac processor transitions. Makes me feel old. First it was the Motorola 680×0 series, followed by the PowerPC and, by 2006, Intel.

    Overall, the last one went pretty well. There was a way to run PowerPC software for a few years, courtesy of something called Rosetta. It was pretty decent from a performance standpoint, unlike the 680×0 emulator, which suddenly put you a couple of generations behind in terms of how well apps ran until they were rewritten for PowerPC. Until those native apps arrived, the all-new RISC architecture didn’t seem so impressive.

    So is Apple planning yet another processor switchover? Well, consider how Apple has managed to deliver its A-series processors with huge performance boosts every year, very noticeable with most apps.

    Compare that to new Intel processor families that might be measurably more powerful than the previous generation, but the performance advantages are often barely noticeable without a scorecard. Apple’s advantage was to create an ARM-based processor family that took direct advantage of iOS. It wasn’t bogged down with legacy support for things that never existed on an Apple platform, making for more efficiency.

    So does Apple have a Mac on ARM in its future? Microsoft tried Windows RT (on ARM) without a whole lot of success, but perhaps its second try will fare better.

    Using Apple’s Xcode, it shouldn’t be such a big deal for developers to manage a transition to ARM, building fat binaries that contain both ARM and Intel code. Recent rumors have it that you’ll be able to run iOS apps on Macs, and vice versa, more or less. The Touch Bar on the latest MacBook Pros is driven by a second processor in that computer, an A-series system-on-a-chip. A similar scheme is used for low-level functions on the iMac Pro.
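    To make the fat binary idea concrete, here’s a minimal sketch in Swift, assuming nothing beyond the language’s standard conditional-compilation checks. The same source file is compiled once per architecture, the slices are packaged into a single binary, and only the matching slice runs on a given machine; the function name is purely illustrative.

        func currentArchitecture() -> String {
            // Each slice of a fat binary is compiled with only one of these
            // conditions true, so one source file yields per-architecture code.
            #if arch(arm64)
            return "arm64"      // an ARM-based machine
            #elseif arch(x86_64)
            return "x86_64"     // an Intel-based Mac
            #else
            return "unknown"
            #endif
        }

        print("Running the \(currentArchitecture()) slice of this binary")

    Combining the per-architecture builds into one file is a packaging step handled by the build tools, which is why the extra work asked of developers would likely be modest.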

    So Apple is clearly taking you partway already. How long would a full shift take, and should you take such a possibility seriously?

    It’s a romantic ideal: Apple would have full control of more and more of the parts that make up its hardware. It would also allow the Mac to be far more than just a higher-priced PC in a fancy box.

    According to recent reports from reporter Mark Gurman of Bloomberg, the prospective shift may happen beginning in 2020. Take it with a grain of salt for now.

    But can an iPhone or iPad chip really power a Mac with equal or better performance than current models? Consider the benchmarks that show Apple’s mobile hardware exceeding the performance of most notebook PCs and coming very close to the MacBook Pro. No doubt those CPUs are not running full tilt, in order to lower the drain on resources and battery life. What would those benchmarks look like if Apple allowed them to run full bore?

    What about the chips shipping two years from now? Remember, too, that Apple already has control of its own graphics hardware, so what happens to its existing partners, AMD and NVIDIA? Apple probably wouldn’t care if it took these steps.

    It wasn’t so easy for Apple to persuade developers to adopt PowerPC, but far easier to go to Intel, since there was so much legacy software on the Windows platform. That meant that many developers knew how to optimize their Mac apps for Intel. As I recall, it wasn’t such a difficult move.

    But there was one key advantage of Apple going Intel, other than being assured of regular improvements, more or less, in the chips. It was the ability to run Windows natively with Boot Camp, and at pretty good speed with virtual machines courtesy of such apps as Parallels Desktop.

    If Boot Camp and virtual machines have to run in emulation on one of these newfangled Macs, how much would performance deteriorate? Or would Apple devise ways to work around this, such as licensing some Intel chip functions or using the graphics hardware to reduce the performance bottleneck? I would be loath to predict how it could be done, but if the ARM chips end up significantly faster than their Intel counterparts, maybe most people won’t notice much of a difference.

    It wouldn’t take the infamous performance hit of running Windows under emulation on the PowerPC. That was just dreadful. I remember that opening a document would often take a full minute or two.

    Some suggest that Apple, which has often ditched older technologies without apology, might just give up on the concept of running Windows on a Mac. But I suspect lots of users still need that feature. I also suspect that Apple is quite capable of devising a solution that wouldn’t hurt performance in any particularly noticeable way.

    But this all needs a reality check. That Apple could make this change doesn’t mean it will. It might very well be that Intel’s existing hardware roadmap is a viable solution, without saddling Apple with the development costs of a new processor transition. But there are good reasons for consistent hardware across its major platforms. If the annual improvements in Apple’s A-series CPUs continue to provide healthy double-digit performance boosts, maybe it will happen after all.

    I’m skeptical, but with Apple, never say never, particularly if Intel confronts any serious headwinds in improving its chips going forward.



    Fact-checking Consumer Reports and its 2017 VIZIO M-Series Review

    January 5th, 2018

    Despite its pretensions of factual and technical accuracy, I’ve long had issues with the way Consumer Reports magazine manages its reviews. A notable example is the curious way in which notebook computer battery life is calculated. When CR first reviewed Apple’s 2016 MacBook Pro with Touch Bar, it wasn’t recommended due to reported battery life inconsistencies.

    Apple quickly responded and, working with the magazine, found that CR’s test methodology, which requires turning off browser caching via Safari’s Develop menu in macOS Sierra, helped trigger a rare bug that virtually nobody would see under normal use. But Apple fixed the bug anyway, and the magazine reversed its non-recommendation. Still, nobody expects the MacBook Pro to deliver over 13 hours of battery life in regular use.

    Without going into detail, when I’ve checked the reviews of cars I’ve owned and/or driven, I’ve run across occasional stark differences in the way features were described. The side bolsters of the leatherette seats in recent VW Jettas, for example, which CR described as thin, are actually thick and really hold you in place around a sharp curve. Or perhaps there was a manufacturing change of some sort after CR did its review.

    So we come to a section containing TV reviews, and I decided to compare CR’s rating to what I achieved in evaluating a 55-inch 2017 VIZIO M-Series. Whereas the set has received really high scores from CNET and other sources due to its combination of high performance and a low price, CR’s rating was a decidedly mediocre 62, compared to similarly sized models that earned scores of up to 88. To be fair, the highest ratings went to some decidedly pricier sets, including OLED models from LG.

    Unlike other reviewers, CR buys all the sets it covers anonymously from regular retail establishments. Supposedly this makes the publication incorruptible, because the manufacturer can’t send a ringer, a specially adjusted sample that would deliver better performance than the shipping product. Not that I ever encountered anything of the sort in the 25 years I’ve been reviewing tech gear, but I’ll grant it may be possible.

    Regardless, CR is entitled to its priorities, but not its facts.

    Just where did the VIZIO fall down? Well, it’s downgraded in several areas, such as HDR “effectiveness,” and the supposedly “limited viewing angle” that actually scores as “good.” Go figure!

    In addition, the lack of a TV tuner is criticized, but there is also a peculiar conclusion, that the “required tablet-control device [is] not included.”

    I understand why the lack of a tuner might be important for some people. This design decision is clearly intended to cut costs and not have people pay for a feature they aren’t going to use. Tuners are available from Amazon for around $30 or so, and thus it shouldn’t be an issue. In the future, there will be an updated broadcast standard, ATSC 3.0, supporting 4K and other features. So when ATSC 3.0 tuners arrive, you’ll be able to buy one without being saddled with a TV that has the older hardware. I suppose that’s a way to future-proof.

    The claim that a tablet is required is simply not true, although a mobile device will help expand the built-in Google Chromecast feature. As it stands, the set ships with a small number of preloaded apps that include Amazon Prime Video, Hulu and Netflix. This selection probably accommodates most users. For those who want YouTube and thousands of other services, you can pair the VIZIO with an iOS or Android device with the company’s SmartCast app.

    No tablet required. On the other hand, since the CR review is based on an older firmware version on its test VIZIO, perhaps these apps were added in a subsequent update.

    CR also claims there is no Internet capability, but since the set offers both an Ethernet port and Wi-Fi connectivity, and uses the Internet to receive streams for its embedded apps, the statement is just not true. Thus the review is a little jumbled. Maybe CR is reviewing so many sets, it just can’t get all its facts in order.

    In other review categories, Ultra HD (4K) performance is rated as merely “very good,” largely because of less-than-stellar upconversion from HD to UHD. Since most of what you’ll be watching on such a set is HD, this process is of critical importance. For me, cable reception is clearly better than on my previous TV, a 2012 VIZIO E-Series, but edges, particularly lettering, are sometimes jaggy if you look real close. Otherwise it’s not going to be much of an issue.

    That said, I was particularly interested in the “Optimized Picture Settings” that were obtained in CR’s test laboratory. In brief, they were actually quite close to the Calibrated Dark settings achieved by CNET. Compared to the Calibrated setting, the backlight is turned way down. Consistent with other reviews, sharpness is set at zero. It’s generally felt that set manufacturers make the edges just too sharp, perhaps to make a better impression when customers do comparisons.

    So I switched to the CR settings, which took maybe a minute or two, and turned off Auto Brightness, which I had been experimenting with. So far there is a definite if slight improvement in color rendition, particularly flesh tones. That’s probably the result of a somewhat higher color setting, plus changing the set’s gamma from the default 2.2 to 2.4.
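    For readers who want to try something similar, here’s a rough sketch of those adjustments expressed as data, in Swift. Only the sharpness, gamma and Auto Brightness values come from the text above; the backlight and color numbers are placeholders standing in for “turned way down” and “somewhat higher,” since exact figures aren’t given here.

        // Hypothetical container for the picture adjustments described above.
        struct PictureSettings {
            var backlight: Int        // placeholder: "turned way down" from the default
            var sharpness: Int        // 0, consistent with most reviews
            var color: Int            // placeholder: "somewhat higher" than the default
            var gamma: Double         // changed from the default 2.2
            var autoBrightness: Bool  // turned off
        }

        let crInspired = PictureSettings(
            backlight: 20,
            sharpness: 0,
            color: 55,
            gamma: 2.4,
            autoBrightness: false
        )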

    Overall, the CR review seems fairly consistent with some of the results I achieved, but I’m concerned about the contradictions and clear errors in some ratings categories. This demonstrates a lack of attention to detail, or perhaps the editors are so reliant on boilerplate templates for reviews that they were a little careless in the final editing process.

    Regardless, as my review progresses, I continue to enjoy the rare selection of true 4K content that reveals the set’s superior picture in all its glory, and even the standard HD fare from the cable company looks a whole lot better.



    A New Slant on Universal Apps?

    December 22nd, 2017

    Many of you have heard of the term “fat binaries,” or “universal apps,” in which the code will work on more than a single platform or product. It may even offer a different look and feel depending on the needs of that product.

    So when Apple went to Intel CPUs beginning in 2006, it included a built-in emulator, dubbed Rosetta, that ran PowerPC apps for several years. That way, you didn’t have to wait for developers to build compatible software. Developers could also serve both camps by building apps that combined PowerPC and Intel code, so-called universal apps. It meant larger downloads, but at least you were assured of a version that would work even on your older Mac.

    Eventually that all went by the wayside; Apple made Rosetta an optional install for OS X Snow Leopard. Support was removed beginning with OS X 10.7 Lion. After that it was Intel or nothing.

    Microsoft has supported fat Windows 10 binaries that run on regular PCs, tablets and, while they lasted, mobile gear. But with the constrained resources of a smartphone, it would seem wasteful, unless the download process strips the unused code from the binary.

    Now there are published reports that Apple is planning to take a similar approach with its current gear. So developers may someday be able to create one universal app that works on an iPhone, iPad and, yes, a Mac. The story comes from Bloomberg, which has a less-than-stellar reputation for accurate reporting about Apple, but it posits some intriguing possibilities.

    As it stands, many iOS apps are universal in that they are optimized for both iPhone and iPad, with their very different display sizes and feature optimizations. That makes perfect sense, as does automatically stripping an installer of unneeded code to keep the download size as small as possible. It also simplifies the development process.
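    As a rough illustration of what “universal” means in practice on iOS, here’s a minimal Swift sketch, with a hypothetical view controller and layout method, of a single app adapting its interface depending on whether it finds itself on an iPhone- or iPad-class device:

        import UIKit

        // Hypothetical view controller; the column counts are purely illustrative.
        final class AdaptiveViewController: UIViewController {
            override func viewDidLoad() {
                super.viewDidLoad()

                switch traitCollection.userInterfaceIdiom {
                case .pad:
                    configureColumns(count: 2)   // larger iPad canvas
                default:
                    configureColumns(count: 1)   // iPhone-class display
                }
            }

            private func configureColumns(count: Int) {
                // Placeholder for real layout code.
                print("Configuring \(count) column(s)")
            }
        }

    The same app ships both behaviors; stripping unneeded code and resources at download time is what keeps the install from carrying anything the device will never use.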

    The theory from the Bloomberg blogger goes that, if developers can build one version of an app for the three Apple platforms, more software will be available in the Mac App Store. I suppose that means the selection could be larger.

    But Apple’s own sandboxing restrictions already limit the kind of software available for iOS and macOS. So, for example, Rogue Amoeba’s Audio Hijack, used to capture and mix audio from a number of sources, wouldn’t be approved for Apple’s online software repositories. I should think ways can be found to ensure security in pushing audio from one app to another, but I don’t claim to be a developer.

    In any case, the article cites unnamed sources in claiming the existence of a project said to be called Marzipan. But that doesn’t mean the name is official. It may even be that some developers would simply like something of this sort to happen. Then again, such a move would have no meaning for most people outside of the developer community.

    Besides, it’s not the same thing as running an iOS app on a Mac. That’s already done in emulation in Xcode so mobile apps can be developed, but as a practical matter, many iOS apps are limited-purpose or otherwise restricted compared to their macOS counterparts. Much of this is due to constrained resources and the requirements of Apple’s mobile hardware.

    If true, this move has the potential of making it far easier for developers who have produced millions of iOS apps to embrace Apple’s traditional computing platform.

    It would also demonstrate Apple’s ongoing commitment to the Mac, something that was a little questionable in 2016, when only one product, the MacBook, received an update until fall, when the controversial MacBook Pro arrived. And some people weren’t even satisfied with that.

    Yet another suggestion is that Marzipan is only the first step towards running macOS on Apple’s custom A-series silicon. Apple is already offering such chips for specialized tasks, such as the MacBook Pro’s Touch Bar and Touch ID implementations, and low-level functions on the iMac Pro.

    They aren’t intended, so far at least, to replace Intel’s CPUs. But offloading certain functions to Apple-designed chips may make way for better performance, and for Macs with features that no PC maker can easily duplicate.

    Is that the first step towards a wholesale chip migration? I suppose you can romanticize the idea that today’s A-series CPUs can essentially match, or at least approach, Intel silicon in many performance parameters. Remember, too, that even today’s A11 Bionic chip, used in the iPhone 8 and the iPhone X, is probably not run at full bore because of the resource constraints of mobile hardware.

    I don’t think it’ll happen in the foreseeable future, unless Intel falls down bigly in developing new Core chips. It’s not just a matter of performance, or of being able to ease the migration with an Intel emulator that offers decent speed. Such a move would also handicap the ability to run Windows natively on a Mac with Boot Camp, or with really good performance in a virtual machine. Apple would have to develop CPUs with much faster performance than Intel offers to overcome the losses entailed in emulation.

    But building a universal or fat binary shouldn’t represent a huge problem, if such a move makes any sense for developers, and, of course, to Apple.
