Report: Apple Could Release Four Macs with M3 Chips in 2023

Posted on June 27, 2022 by Laurent Giret in Apple with 27 Comments

The just-released 2022 13.3” MacBook Pro is the first Mac powered by Apple’s brand-new M2 chip, and it will soon be followed by a redesigned 13.6-inch MacBook Air next month. While there was a two-year gap between the first M1 Macs and the first M2 Macs, we may see the first M3 Macs ship as soon as 2023, according to Bloomberg.

In the latest edition of his Power On newsletter, Mark Gurman revealed that we may see a “deluge” of new Apple products between the fall of 2022 and the first half of 2023. Later this fall, Apple is expected to launch four iPhone 14 models, three new Apple Watch models including a rugged edition for extreme sports fans, and a new entry-level iPad with a USB-C port and optional 5G connectivity.

Apple is also expected to bring its new M2 chips to several new products this fall, including two new 11-inch and 12.9-inch iPad Pro models. The M1 Mac Mini released alongside the M1 MacBook Pro and MacBook Air may also be refreshed with an M2 chip this fall, but Gurman wrote that an M2 Pro model is also in the works.

Gurman expects the regular M2 chip to power Apple’s first AR/MR headset, which the company could announce in January 2023. In addition to the new headset, Apple is also planning to release M2 Pro and M2 Max 14-inch and 16-inch MacBook Pros in the coming months, as well as a redesigned Mac Pro powered by M2 Ultra and a new M2 Extreme chip.

That’s a lot of M2-powered devices, but Apple is already working on an upcoming M3 chip that could make its debut in four Mac models, including a 15-inch MacBook Air. According to Gurman, “the company is planning to use that chip as early as next year with updates to the 13-inch MacBook Air code-named J513, a 15-inch MacBook Air known as J515, a new iMac code-named J433 and possibly a 12-inch laptop that’s still in early development.”

Overall, Gurman believes that Apple is “about to embark on one of the most ambitious periods of new products in its history,” which is quite exceptional for a company that usually slowly iterates. However, there’s still a possibility that the ongoing chip shortages may impact Apple’s product pipeline one way or another.


Comments (27)


  1. spiderman2

    So you mean they're releasing a new gen of CPU per year? Wow, that's something that no one (Intel, AMD, Qualcomm...) ever did before!

    • bluvg

      Sarcasm? The major version number increment doesn't correspond with major changes. M2 was a pretty minor update over M1 CPU-wise.

      • spiderman2

        yup, I mean every year Intel presents a new Core gen, and AMD a new Ryzen series ... that's it

      • michael_babiuk

        Interesting definition for a "minor" update. According to TechRadar (and other publications), the M2 enjoys 25 to 35 percent better graphics performance over the M1. The M2 enjoys roughly 18 percent better multi-threaded CPU performance over the M1, and its neural engine runs 40% faster than the M1's. Plus, the M2 features 50% more memory bandwidth, a 25% increase in transistor density, 50% greater unified memory and ... on and on and on. Yup, "yada yada yada". PC guys like specs, but all I know is that the BASE M2 is a very good upgrade over the M1 chip, and I would not classify it as a "minor upgrade". Can't wait to see what the Pro and Max variations of the M2 chip will offer.


        But you are correct. The M2 will seem like a minor upgrade when compared to a die shrunk M3 chip design. I would not bet against the M3 having a very noticeable performance jump over the M2.

        • wright_is

          And the media engine cores, which were restricted to the M1 Pro and above, for accelerated H.264 and ProRes encoding/decoding, effectively freeing the CPU and GPU for other things while the video is being encoded.

        • bluvg

          "that's something that no one (Intel, Amd, Qualcomm...) ever did before!"

          "a pretty minor update over M1 CPU-wise."

          "Interesting definition for a 'minor' update"


          Benchmark results are showing the M2 not much faster than the M1 in single-threaded workloads, and up to 16% faster (close enough to Apple's "up to 18%" claim) in multi-threaded workloads. But those are exactly the kinds of CPU results Intel et al have shown, and sometimes much better (e.g. 2021 Alder Lake is dramatically faster than 2020 Tiger Lake). These are not the considerable gains many were breathlessly predicting for the M2 over the M1.


          Don't get me wrong, Apple silicon is very impressive, especially in terms of efficiency (!). But as with all things Apple, there's always the more pronounced hype vs. reality dynamic and sliding scales when it comes to hard metric comparisons.

          • wright_is

            Yes, but you are talking about synthetic (cross-platform) benchmarks. Also check out Rene Ritchie's Benchmark LARP video, very informative.

            https://www.youtube.com/watch?v=dOuumvkwX1o


            You need to look at real-world workloads to see where the difference is. It is the offloading of specialist tasks to dedicated processing units where the real performance gains come in. Apple is making the silicon specifically for its needs and the needs of its key userbase. Intel makes general purpose processors and general purpose benchmarks are a good comparison between its chips, and AMD.


            But these standard benchmarks don't take into account where Apple is investing its time and effort into tweaking performance. More informative are real-world benchmarks, like the recent ones by c't in Germany, where they actually did real-world testing on DAW, photo editing and video editing, to see where the M1 Max and M1 Ultra really were faster and where Intel was still faster.


            Essentially, if you are running legacy software, like the Adobe Creative Cloud, which is so old it doesn't really take advantage of multiple cores, you are better off going with an Intel workstation over the M1 Ultra, but if you are using modern software, like Serif's Affinity suite, the M1 Ultra is much faster, because it is optimised for more cores. The same with DAWs: some of the older legacy suites, like Cubase, would top out the M1 Ultra at about the same rate as a 12th-generation Core i9, but Apple's Logic Pro, and I think Bitwig Studio, were several orders of magnitude faster than the Intel chips and standard M1, because the software was optimised for multiple cores.


            If the benchmark is just testing raw math and single cores, yes, the improvement is minimal, but not many users run math benchmarks or graphics benchmarks all day long (and the graphics benchmarks have to use off-screen rendering to even load up the GPUs, see the link to Rene's explanation of Benchmark LARP above). But users who do video and graphics editing or make heavy use of the neural engines will find a huge improvement in performance over the standard M1 - if you are a heavy-duty user, you will still be better off with an M1 Max or M1 Ultra.


            Although the base 256GB SSD model M2 MBP seems to be slower than the M1 in data transfer. It looks like they went with a single 256GB chip this time round, which works out slower than the previous generation's 2x128GB - parallel writes across both chips are faster than serial writes on the single chip.

      • ebraiter

        So the same vulnerabilities in the M1 chip are in the M2 chip. :-)

        • wright_is

          Yes, if you have the time to use the exploit code, it brings them back down to the same level of security as every other ARM chip or Intel and AMD chip...


          It is like Meltdown and Spectre, interesting, a genuine flaw, but if you are in a position to use it on an end-user machine (as opposed to a cloud VM host), you don't really need to use the exploit, because you already have access to the data on the machine.

        • bluvg

          That seems likely, though they're not very serious vulnerabilities.

    • rob_segal

      Intel couldn't deliver the kind of chips Apple wanted, or on the timeline Apple needed. Apple believed it could build the chips it needed better than Intel, AMD, or Qualcomm could, and it was proven right.

      • spiderman2

        I just said that every year Intel, AMD, Qualcomm, etc. provide a new gen of their CPUs, that's it. But I love how Apple fanboys always need to put in a few good words for their beloved company.

      • bluvg

        ...for now. People sang Apple's praises for their switch to IBM initially also, but that soured later. They weren't in nearly the same position they are now, though: their mature mobile silicon drives their Mac silicon, which in turn feeds back into their mobile silicon. But Intel is also claiming it will reach performance and efficiency parity with Apple by 2024/2025 (and no, not with where Apple is now, but where Apple will most likely be then, which is highly dependent on TSMC, which publicly discusses its roadmap).

        • wright_is

          The one interesting thing is, Apple are making iOS and macOS specific processors designed to carry out the processing loads that their users will be doing most.


          That means they can leave out a lot of the general purpose fluff (saving gate count and energy) and add in extra units that perform specific tasks in a very optimised way.


          Intel did something similar with its GPU, adding in a dedicated HD video encoder, which actually ran rings around ATi and nVidia graphics cards at the time (late 2000s), but ATi/AMD and nVidia quickly caught up after a generation or two. But Intel are still building general purpose chips that have to do everything for everybody, so they are not as optimised as they could be - one of the reasons why their chips are so power hungry, in comparison to the M-series chips.


          Qualcomm also suffers from this, as do most other ARM designs. They are making chips for many customers with many different needs, so they can't be tailored to the platform they will be used on.


          That is Apple's advantage at the moment. The question is how long this will keep them ahead. With energy costs becoming a critical factor in pricing up a new system these days, we should see Intel and AMD up their game in the efficiency stakes.


          It is certainly one of the reasons why I went with an M1 Mac at the last upgrade, as opposed to upgrading to a newer Ryzen 7 desktop. Likewise I'm also using a Raspi 400 a lot now, for simpler tasks.

          • Scsekaran

            That could be a disadvantage in the long term. When they design a CPU specific to their OS/software/codecs, the advantages will be limited to those specific situations.

            • wright_is

              It is the same problem that Microsoft has in the software world.


              Whilst Apple can tell their customers, that they are dumping old frameworks or moving to a new hardware platform, Microsoft is so beholden to its corporate customers, it just can't do that, without alienating them.


              Intel has a similar problem, there is a lot of legacy stuff built into the chips and there are lots of bits of silicon on those chips that are never used by many operating systems, they are "special cases", that a small number of customers need.


              That all adds up to a large number of gates that have to be powered, but never actually do anything during the whole life of the product. This so-called "dark silicon" still draws energy, meaning that the chips can never be really efficient, as long as the legacy crud or special case hardware can't be banished.


              Intel (and AMD) could simplify their designs by declaring legacy blocks deprecated and remove them, or they could make special versions of the chips with the special-case stuff in them. But that isn't economical, from a design, production run and testing points of view, so the kitchen sink gets packed in with everything else, "just in case".


              That is Apple's advantage, they strip out the legacy cruft & special cases that will never occur, because Apple designs the OS around the hardware and the hardware around the OS, so they can mutually improve the efficiency of the hardware and the OS.

  2. ebraiter

    I don't even bother with these "reports". Until it is officially announced or is expected [i.e. a new iPhone series every fall], who cares.

  3. Brazbit

    Or they could not...


    (Sorry, I miss Paul's short takes.)

    • jgraebner

      Yeah, I've been surprised to recently see Thurrott.com using headlines in this style, since Paul has made fun of them so much over the years.

  4. johnnych

    Apple is on a roll now with these new machines and hardware, they don't have to depend on anyone else to complete their goal and vision! The 14 inch MBP is one of the best machines I have purchased to date and it was just the first version of the Apple Silicon chip. Hopefully they continue to make products that the customers actually want and are asking for.

  5. Donte

    Mac sales are higher than ever, but in my experience it's Intel Mac users gulping down the Kool-Aid and getting on these M versions. At my company we have bought a lot of new Macs in the last year, but none of them were net new, just Intel Mac replacements.


    It will be interesting to see how sales go after the Intel Macs are not supported any more.

    • spiderman2

      Of course they are; Mac users must move to ARM CPUs before they get cut off... we'll see in the future

    • rob_segal

      It's not kool-aid. The M1 chips are amazing. Apple hit a home run with its performance per watt. Intel Mac users should upgrade to Apple Silicon Macs. It's a no-brainer decision.

      • nbplopes

        Indeed, it was a very, very noticeable performance, efficiency and responsiveness jump from Intel-based Macs to the M1.


        When it came out, I read a bunch of articles arguing otherwise, comparing it to new Intel-based laptop systems ... but the thing is ... denial does not shed light on the truth. Active experience does.

      • bluvg

        It's a no-brainer in many cases (and obviously Intel won't be an option in time), but some should wait a bit longer. There are still janky compatibility issues with some software, though that will surely get sorted in time.

      • Donte

        If performance per watt is important to you, then yes, they are great. Or for specific workloads, like anything that uses the built-in codecs, which is basically video editing software that has been ported, like Final Cut.


        Outside of that, they are no better at web browsing, YouTube watching, word processing, email or removing red eye from a picture taken with a smartphone. My 16-inch Intel MacBook was replaced with an M1 Pro version. The thing I notice most is the extra weight and thickness. I do not do video work, nor am I on battery for more than a few hours.