OpenELM: Apple's New Open Source LLM (OpenAI Competitor?)

  • Published: 26 Apr 2024
  • Apple's recent commitment to open source has been wild to watch. OpenELM and Corenet are unexpected additions to the OSS world, curious to hear how y'all feel about it!
    SOURCE
    github.com/apple/corenet
    huggingface.co/collections/ap...
    Check out my Twitch, Twitter, Discord more at t3.gg
    S/O Ph4se0n3 for the awesome edit 🙏
  • Science

Comments • 212

  • @4.0.4
    @4.0.4 1 month ago +431

    Can't wait for the trailer where Apple invented LLMs.

    • @RickySupriyadi
      @RickySupriyadi 1 month ago +1

      LOL

    • @_beyi
      @_beyi 1 month ago +10

      closed source apple llm: COSTS $5199

    • @brainites
      @brainites 1 month ago +1

      @@_beyi 🤣

    • @EnricoGolfettoMasella
      @EnricoGolfettoMasella 1 month ago

      And after all the hype and excitement of the newbies believing it!

  • @GyattGPT
    @GyattGPT 1 month ago +63

    Apple engineers working hard on the weekends...

    • @MrHolesVids
      @MrHolesVids 1 month ago

      @@liquidsnake6879 I'd find a new job mate!

  • @monad_tcp
    @monad_tcp 1 month ago +110

    When all the open source LLMs combine like a Megazord, OpenAI's moat will be drained.

    • @greenaware8580
      @greenaware8580 1 month ago +3

      Bro, what's a Megazord?

    • @mikopiko
      @mikopiko 1 month ago

      @@greenaware8580 If only there were a way to search for such information, hmmm

    • @Redditard
      @Redditard 1 month ago

      ​@@greenaware8580 You don't want to know
      www.google.com/search?q=megazord

    • @tvnishq
      @tvnishq 1 month ago

      @@greenaware8580 Power Rangers

    • @edumorangobolcombr
      @edumorangobolcombr 1 month ago +1

      What’s a Bro?

  • @headbangingidiot
    @headbangingidiot 1 month ago +25

    Llama 3 was also trained on publicly available information

    • @v_6302
      @v_6302 1 month ago +1

      Training Llama cost Meta something like 2 million, and they accidentally leaked it. Now almost every open source model is based on Llama being fine-tuned (about 1% of the training cost). The data isn't the problem; it's the infrastructure and the cost of training.

  • @zynthssam7461
    @zynthssam7461 1 month ago +61

    A little note on the Llama 3 model card: "Neither the pretraining nor the fine-tuning datasets include Meta user data"
    So the argument for the safety of Apple's models over Meta's is not completely valid.

    • @jkelley2841
      @jkelley2841 1 month ago +18

      To be fair, if you dive into the current state of LLM research, user data is pretty much meaningless, believe it or not. Don't let a lot of these "scaling laws" confuse you. These models require the highest quality data. It's actually insane the amount of effort that went into something like GPT. There's a great article on this by an OpenAI employee: "The 'it' in AI models is the dataset." However, amazing data is expensive. Meta doesn't use your Facebook posts because they suck... not because of ethics lmfao. I love Zuck currently but man THEY WOULD IF THEY COULD. The push right now is towards synthetic data for this reason. There's even starting to be momentum towards dropping facts completely, like "person X was born on this date", because models will be able to look up and verify details like that (Perplexity AI). That way these models can generate more synthetic data that relies less on facts and more on trains of thought.

    • @Kris96431
      @Kris96431 1 month ago

      You believe them 😅

    • @zynthssam7461
      @zynthssam7461 1 month ago +1

      @@jkelley2841 You are so correct.
      I didn't even know about that last point you made about fact dropping, but it makes a lot of sense to "outsource" facts to focus training on reasoning rather than memorization.

    • @gro967
      @gro967 1 month ago

      It’s Apple, they are just as evil as everyone else, but they have better marketing…

  • @flwi
    @flwi 1 month ago +7

    Very interesting. I never thought Apple would do this. It reminded me of the moment when Micro$oft announced TypeScript, VS Code, and the Language Server Protocol.
    I'd be interested in a video about running models locally!

  • @weeee733
    @weeee733 1 month ago +6

    Yes, cover more of these please. This is the first video I've seen from this channel and you gained a sub.

  • @v_6302
    @v_6302 1 month ago +13

    Most LLMs are trained on open source data (the internet), including some private data. The problem with Siri isn't that the data was bad. It's that it's tiny and based on old AI models (I don't even know if Siri is even AI; it behaves more like a conversation bot with preconfigured questions and answers).
    The difference between OpenAI and Apple is the cost of training these models. Llama from Meta cost something like 2 million and was just accidentally leaked.

    • @msclrhd
      @msclrhd 1 month ago +2

      Expert systems (with a question-answer style decision tree) are an older branch of AI research. Neural Networks (NN) and Machine Learning (ML) are another branch. LLMs build on the NN/ML research side of AI.

    • @rollbacked
      @rollbacked 1 month ago +4

      Siri is essentially the equivalent of a voice-based menu system; the only AI thing about it is its synthesized speech.

    • @GrantCelley
      @GrantCelley 1 month ago

      Siri was most likely like Rasa

  • @shiaulis
    @shiaulis 1 month ago +13

    Apple usually does releases like this 1-2 months prior to WWDC. The next WWDC is in a month. I don't think these releases are in any way connected to their stock price.

  • @LeandroAndrus-fn4pt
    @LeandroAndrus-fn4pt 1 month ago +4

    Apple obviously wants the Mac to be the primary AI development platform, and they want the Apple Silicon Neural Engine to be the most compatible with the most popular neural networks…

  • @undifini
    @undifini 1 month ago +3

    The 4 sizes of OpenELM might be for the Watch, older iPhones, newer iPhones (A18+), and Macs/M-powered iPads respectively.

  • @aaron_poppie
    @aaron_poppie 1 month ago +2

    I have been using LM Studio for months now. So much value in being able to run your own models. Would love to see more AI content; by following your tutorials and using AI I was able to build my first React app. A year ago I had never really coded. It's a whole new world!!!
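
For anyone wondering what running your own models with LM Studio looks like in practice: it can expose a local OpenAI-compatible server, so a few lines of Python are enough to chat with whatever model is loaded. A minimal sketch, not an official example; the localhost port, placeholder API key, and model name are assumptions that depend on your local setup.

```python
# Minimal sketch: talking to a model served locally by LM Studio.
# Assumes LM Studio's local server is running with a model already loaded;
# the address/port and model name below are assumptions, not fixed values.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

reply = client.chat.completions.create(
    model="local-model",  # placeholder; the server uses whichever model is loaded
    messages=[{"role": "user", "content": "Summarize what OpenELM is in one sentence."}],
    temperature=0.7,
)
print(reply.choices[0].message.content)
```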

  • @mchisolm0
    @mchisolm0 1 month ago

    Thank you for this. Very cool to see a privacy-focused company throw their weight publicly behind open source more.

  • @JavedKhalil
    @JavedKhalil 1 month ago +6

    Yes, do more LLM stuff. It's interesting, it seems to be going somewhere, and there are hidden mysteries to reveal there.

  • @devanfarrell16
    @devanfarrell16 1 month ago +2

    I'd love to hear you cover this stuff more. It's hard to keep up with, and I feel like it's in our best interest as engineers to know what's going on even if it isn't our specialty at this point.

  • @captainobvious1415
    @captainobvious1415 1 month ago +1

    I'm wondering how the new nullification of non-compete contracts will affect the development of AI at other companies.

  • @andre-le-bone-aparte
    @andre-le-bone-aparte 1 month ago +1

    Theo: @8:50 I didn't expect this, how's their stock doing?
    Roaring Kitty: I LIKE THE STOCK

  • @Patrick-pw1cr
    @Patrick-pw1cr 1 month ago +1

    CoreNet seems to have a focus: neural net inference optimizations on mobile Apple silicon... this could be the first visible effort to deliver a true AI assistant/agent with every iPhone.

  • @a7zz
    @a7zz 1 month ago

    I think it would be really cool to see videos about building & deploying custom pipelines (to GCP? Azure? AWS?) using oss models

  • @m12652
    @m12652 1 month ago

    Brilliant, more please. I was getting bored of hearing about AI till this popped up 👍

  • @gro967
    @gro967 1 month ago

    Please cover more of the Microsoft internal AI stuff, that is just as interesting as what OpenAI does.

  • @Embassy_of_Jupiter
    @Embassy_of_Jupiter 1 month ago

    My first thought was that they'll probably use it for spell checking and autocomplete. I don't know what else such a small model would be useful for. Maybe for correcting errors in speech recognition?

  • @josephlabs
    @josephlabs 1 month ago +1

    These moves are piquing my interest for WWDC. I always wished that Apple improved training on their M chips; inference is good, but training is how progress is made.

  • @FunwithBlender
    @FunwithBlender 1 month ago

    I think you meant to make a clear distinction between open source vs open weights

  • @_BonsaiBen
    @_BonsaiBen 1 month ago +1

    “Mistral mending their way to the top” 🤣🤣🤣

  • @karmatraining
    @karmatraining 1 month ago

    This is indeed very interesting.

  • @znewt99
    @znewt99 1 month ago

    I thought Apple mainly worked on Thunderbolt 3, not necessarily the USB-C standard itself: just how to transition Thunderbolt from Mini DisplayPort to USB-C.

  • @andrewnageh3163
    @andrewnageh3163 1 month ago

    Which browser are you using?

  • @MeinDeutschkurs
    @MeinDeutschkurs 1 month ago

    I think these models are supposed to write text for a Siri AI. Siri could do research on the web or elsewhere, provide the result to the LLM, and read out the output. The model just needs to be able to write proper language.

  • @saaaamiimb
    @saaaamiimb 1 month ago

    Interestingly enough, the corenet repo has a section that says "For macOS, assuming you use Homebrew".

  • @mynameajeff7459
    @mynameajeff7459 29 days ago

    definitely make more content about this stuff

  • @dehrk9024
    @dehrk9024 1 month ago

    Theo's yapping is my go-to to keep my brain sharp on Sundays xD

  • @jhonny6382
    @jhonny6382 1 month ago

    Given the sizes, it seems like the idea is to have the models running locally on Apple devices. You could probably run this without any problem on an iPhone 15; 270M parameters is tiny, though, so I'm not really sure it's going to be a decent model.
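
The smallest of those checkpoints is easy to poke at from a laptop before judging it. A rough sketch, with assumptions: the Hugging Face OpenELM checkpoints ship custom modeling code (so trust_remote_code is required), and the model card pairs them with a LLaMA tokenizer that sits behind Meta's gated license; the prompt and generation settings here are illustrative.

```python
# Rough sketch: loading the smallest OpenELM checkpoint with transformers.
# Assumptions: trust_remote_code is needed because the repo ships custom
# model code, and the LLaMA tokenizer used here is gated (license required).
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("apple/OpenELM-270M", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

inputs = tokenizer("Once upon a time there was", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```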

  • @AndieKobbie
    @AndieKobbie 1 month ago +2

    Yes please, talk about this more, it is really fascinating and I am looking forward to playing with it myself.

  • @cariyaputta
    @cariyaputta 1 month ago

    Will it be useful for Android?

  • @JulesPatryLawl
    @JulesPatryLawl 1 month ago

    Make more content about this please!

  • @chipChipper375
    @chipChipper375 1 month ago

    Can you please make a video on 'how to develop a standard & structured web application'?

  • @lilyzheng2322
    @lilyzheng2322 1 month ago +4

    I think they are treating devs differently

  • @Praxss
    @Praxss 1 month ago

    WWDC is gonna be interesting this year

  • @vostfrguys
    @vostfrguys 1 month ago +9

    I knew from some interviews that Apple was pretty good at AI. I didn't think they would open source stuff, and they actually do open source stuff, but it's nothing groundbreaking.

    • @Mitch-xo1rd
      @Mitch-xo1rd 1 month ago +3

      I mean, I prefer Apple funding open source AI research to Microsoft tossing billions into the closed source money hole of OpenAI.

  • @OsvaldoGago
    @OsvaldoGago 1 month ago

    What about the model performance? How good is it? Because without that in the video, we can't really know how valuable this tool is, except that it's made by Apple.

    • @v_6302
      @v_6302 1 month ago

      A model's capabilities roughly scale with the number of parameters (though there have been some breakthroughs). These models are likely less capable than a 7B-parameter model like Llama's 7B. For comparison, all the big players like GPT-4, Gemini, etc. are more than 50B parameters.
      They are likely pretty shitty...

  • @coolbeatguy
    @coolbeatguy 1 month ago

    I'd love to understand how this stuff works, by someone just going through the code and explaining it

  • @zachwak
    @zachwak 1 month ago

    I like AI content from level-headed, security-minded engineers like you! Cover more when applicable!

  • @znewt99
    @znewt99 1 month ago

    Apple does not need to pay Google to use the same encryption standard for RCS. Google uses the Signal protocol which is an open source protocol from Signal

  • @dmug
    @dmug 1 month ago +3

    I’m hoping Apple will realize starving their devices and computers of RAM wasn’t a good idea now that AI wants all the RAM…

    • @everyhandletaken
      @everyhandletaken 1 month ago

      This. It was shortsighted to want the extra money for more RAM and then want people to buy into all this.
      Not just RAM either, same thing with storage too.

    • @ZeerakImran
      @ZeerakImran 1 month ago +1

      @@everyhandletaken It allows them to sell the laptops cheaper to people who otherwise couldn't afford them. They also function really well and are underpriced compared to the competition. I know. Hate me.

    • @everyhandletaken
      @everyhandletaken 1 month ago

      @@ZeerakImran Their upgrades are outrageously priced though, & I know this because my employer just got me a new MBP M3. Going from 512GB to 1TB of storage was crazy enough, but 1TB to 2TB wasn't even worthwhile. An M.2 2TB is less than 1/6th of their price, & that was an upgrade price from 512GB, which is already in the base price!
      This MBP was almost 6x the cost of my wife's new Lenovo laptop, & whilst I love it, it is already so expensive without gouging people for upgrades. It should have had at least 2TB of storage and 48GB of RAM for that price, instead of 1TB / 36GB.

    • @ZeerakImran
      @ZeerakImran 1 month ago +1

      @@everyhandletaken Again, they're underpriced to begin with, and anyone who buys the upgraded models can make back 100x the cost of the laptop; plus the MacBook will last 10 years compared to a Windows device which will either last 3 years or have you wishing it did. When comparing prices, be careful not to compare the price of a MacBook with other hardware manufacturers like Asus... With them, all you're buying is a laptop with generic hardware (not designed by them) connected together. Finding a Windows laptop that doesn't make you want to bring a mouse along is difficult enough. That makes all of those laptops overpriced, considering they're shitty laptops even at 1500+ with no battery life or power when not plugged in. Finally, when buying from Apple, you're paying the company to continue doing what it does, not just buying the MacBook hardware alone. Asus is not responsible for the future development of the OS, whereas with Apple you expect improvements and software upgrades free for the next 8 years. There's the R&D and marketing budget for the device, which is the reason we're all discussing it to begin with; the apps that come with it (which many users prefer over alternative monthly-paid solutions); the rent of all the Apple stores you can visit when purchasing the device or when it has an issue; and the fact that Apple has the best hardware reliability of the 4 largest manufacturers, with Microsoft hardware being 4x worse. Take into account how much you value your data, privacy and time, and the legal battles Apple has to fight to preserve some rights regarding your privacy from governments and intelligence agencies. It's the only company that has stood up, actually fought, and no doubt suffered behind the scenes to protect its users to the best of its ability. Of course, there can be no such thing as privacy if your user base is large enough; you will be made to comply or your company will be legally run into the ground. Now, comparing that with Asus, where all they had to do was put a couple of DDR4 RAM modules on a generic motherboard, isn't entirely fair, because we don't get any of those things in return. Just a laptop with a Windows license that pays for itself with data collection and ads. That expensive RAM upgrade is what pays the designers and funds the development of, say, the M line of Apple silicon. Intel and Snapdragon still haven't caught up years later.

    • @everyhandletaken
      @everyhandletaken 1 month ago

      @@ZeerakImran 😂

  • @johnnyappleseed6988
    @johnnyappleseed6988 1 month ago

    Hmm, I am with you on not seeing this coming at all. I am extremely curious to see what additions this brings to Apple's ecosystem. Like you said, these smaller models should be able to run on-device, and if that means I can finally use Siri and it's more helpful than frustrating, that would be nice. Currently, what are your favorite models to use yourself?

    • @4.0.4
      @4.0.4 1 month ago

      Llama-3 8B Instruct is pretty awesome for its size, especially after people tune it, like Dolphin. Bigger ones are great, but this one runs on anything.
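
If you want to try that locally, a plain transformers pipeline is the most direct route. A hedged sketch: the meta-llama repo is gated behind a license acceptance, bf16 weights for 8B want roughly 16 GB of memory, and quantized GGUF builds (e.g. via llama.cpp) are the practical option on smaller machines.

```python
# Hedged sketch: running Llama-3 8B Instruct locally with transformers.
# Assumes access to the gated meta-llama repo and enough RAM/VRAM (~16 GB
# for bf16 weights); the model id and prompt are illustrative.
import torch
from transformers import pipeline

generate = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3-8B-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
out = generate("Three reasons small local models are useful:", max_new_tokens=80)
print(out[0]["generated_text"])
```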

  • @3a146
    @3a146 1 month ago +1

    Apple machines are the closest thing to the so-called AI PC.

  • @m4rt_
    @m4rt_ 1 month ago

    Btw, on the iPhone, Apple may be pushing for privacy, but they are just limiting who can access user data to themselves, so they will be the company that controls the ads, etc.
    Their own apps don't follow the privacy rules that they force upon third-party apps.

  • @BurkeJones
    @BurkeJones 1 month ago

    Yes, more of this

  • @agxxxi
    @agxxxi 1 month ago +1

    Apple is working on smaller models because they're already behind the curve at this moment; starting small gives them a push forward in learning the ropes of optimized LLMs. And most importantly, because they're probably targeting this to work locally on their mobile iOS devices.

  • @hldfgjsjbd
    @hldfgjsjbd 1 month ago

    Just so you know, showing YouTube chat on Twitch during a stream is restricted and can lead to a ban.

  • @MichaelYagudaev
    @MichaelYagudaev 1 month ago

    💯 more of these types of videos 😊. Exciting to see what Apple can do.
    Another note is their unified memory architecture, which allows them to use the same RAM for both GPU and CPU. This isn't true for other desktop PCs, where the CPU has its own RAM and the GPU has its own memory.

  • @insu_na
    @insu_na 1 month ago +3

    If you don't allow your developers to identify with the products they build then they can't be poached by competitors (instead they'll get sick of being treated like garbage and leave for competitors by themselves)

  • @arbifox187
    @arbifox187 1 month ago +1

    What browser??

  • @Ollerismo
    @Ollerismo 1 month ago

    You don't need a GPU. You can download the model and run it on your Mac.

  • @GrantCelley
    @GrantCelley 1 month ago

    Llama was trained on public data too.

  • @SilkCrown
    @SilkCrown 1 month ago +1

    I would like to know more about why you think Apple and Mistral are at the top of the AI field. As far as I can tell their models don't perform as well as Meta's, OpenAI's, or Google's on either standard or qualitative tests. Merely publishing papers and open sourcing things is not an indicator of competitiveness.

    • @ghost-user559
      @ghost-user559 1 month ago +3

      Because hardware is a huge part of that comparison. Of everyone you mentioned, none of them fab their own chips. Apple actually has Neural Engines that are becoming some of the more competitive in the mobile and desktop space, especially with the unified architecture of the Apple Silicon Macs, because the usable VRAM is effectively only slightly less than the RAM, which can go up to almost 200GB on the Ultra and Max chips. That would be many multiples of a conventional GPU, all in one chassis, and that doesn't include the Neural cores on top of that.

  • @MaiChaMH
    @MaiChaMH 1 day ago

    When Apple, of all companies, is more "OPEN" than "OpenAI"

  • @thisaintmyrealname1
    @thisaintmyrealname1 1 month ago

    I read in a newsletter that Meta is pushing hard to commoditize LLMs, so that data becomes the actual competitive advantage. At first I thought this was the case with Apple too, but as you pointed out, they don't have that much data about their users in the first place... so maybe their interest is in achieving language models that are performant and small enough that they can be embedded and fine-tuned directly on the end device? 😮

  • @RickySupriyadi
    @RickySupriyadi 1 month ago

    I tried Microsoft's 3B Phi-3; it's not so bad, but... for my current use case it doesn't cut it... and this one has its parameters counted in millions? I wonder how it's going to perform.

  • @kendrickpi
    @kendrickpi 17 days ago

    Want to know which Apple hardware (powered by neural cores) gets to ride this AI wave? I fear Apple will focus the buzz on their latest hardware to drive new sales, rather than eking out something for all their neural-processing-capable devices. I'd really like to see a complete table of ALL Apple 'AI' hardware and the likelihood of each device/ARM chip running OpenELM or another Apple AI.

  • @lewismcclelland
    @lewismcclelland 1 month ago +3

    Woah, that's so much smaller than Llama, I wonder if it could even be used on phones?

    • @plaintext7288
      @plaintext7288 1 month ago

      on Apple iPhones*

    • @Mitch-xo1rd
      @Mitch-xo1rd 1 month ago +5

      Might fit into RAM, but performance is another story. No CPU can run an LLM with billions of parameters at reasonable speed. But if any company can do it, I could see Apple optimizing it specifically for their hardware.

    • @ninjabastard
      @ninjabastard 1 month ago

      @@Mitch-xo1rd Most Apple chips have a GPU or NPU, so they probably have enough processing power, especially if it's under 1B parameters. A Raspberry Pi can run a 1B model.

    • @pantium98
      @pantium98 1 month ago

      You can run a quantized Llama 3 8B model on your phone, which is way more powerful than OpenELM.
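
The reason a quantized 8B model can squeeze onto a phone is mostly arithmetic: weight memory is roughly parameter count times bits per weight, so going from 16-bit to 4-bit shrinks 8B of weights from about 16 GB to about 4 GB (plus runtime overhead such as the KV cache). A back-of-the-envelope sketch:

```python
# Back-of-the-envelope weight-memory estimate for a few model sizes and
# quantization levels. Real runtimes add overhead (KV cache, activations),
# so treat these numbers as rough lower bounds.
def weight_gib(params: float, bits_per_weight: float) -> float:
    return params * bits_per_weight / 8 / 2**30  # bits -> bytes -> GiB

for name, params in [("OpenELM-270M", 270e6), ("OpenELM-3B", 3e9), ("Llama-3-8B", 8e9)]:
    for bits in (16, 8, 4):
        print(f"{name:>13} @ {bits:>2}-bit: ~{weight_gib(params, bits):5.1f} GiB")
```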

  • @motionthings
    @motionthings 1 month ago +1

    Thinking that Apple is a privacy focused company. lol

  • @Tyheir
    @Tyheir 1 month ago +1

    WWDC is going to be sick

  • @kublaios
    @kublaios 1 month ago

    They will likely put an AI-oriented chip in the next iPhone and other devices that will work with their models the best. They will also brag about how open their AI ecosystem is. They’re good at this.

  • @jamesdavis1239
    @jamesdavis1239 1 month ago

    I'd be interested in more AI content. I just started with stable diffusion & llama 3 and I am so lost. It feels so different from writing & using software.

  • @Trizzi2931
    @Trizzi2931 1 month ago +15

    6:30 Actually, Apple (at least macOS, not sure about iOS) collects a lot of telemetry data by default if you don't turn it off. It contains personal information ranging from your browsing history to app usage and stored credentials. They also share this information with third-party apps that are partnered with Apple. And this is from their privacy policy; who knows what else they might be collecting. They just market themselves as privacy-centric, but in reality they aren't. Out of the box, macOS is as bad as Windows in terms of data collection.

    • @Chimps-fj2ez
      @Chimps-fj2ez 1 month ago +6

      “Apple respects privacy”
      And I am Batman.

    • @asdfghyter
      @asdfghyter 1 month ago

      @@Chimps-fj2ez They don't respect your privacy, they just protect your private data from third parties so that they have exclusive access to it.

  • @yodancristino
    @yodancristino 1 month ago

    I wonder what they are not sharing in the recipe

    • @Kabodanki
      @Kabodanki 1 month ago +1

      They are not sharing the censorship layer, for example.

  • @insain9302
    @insain9302 1 month ago +1

    I would like to see some AI content!!

  • @lucasteo5015
    @lucasteo5015 1 month ago

    I think all big tech AI companies should join the AI open source space too. I know companies aim for profit, but generally they are stable enough to share stuff like this. It's just nicer to have smart people sharing knowledge around and improving it all together; it doesn't have to be a competition every time. People would disagree, but I'm sure all scientists will agree on sharing knowledge.

  • @famnyblom6321
    @famnyblom6321 1 month ago

    It is hard for closed companies to recruit good researchers and keep them from publishing when they can get well-paid jobs elsewhere where they can publish. This was one of the main reasons Meta made their models open source.

  • @Leto2ndAtreides
    @Leto2ndAtreides 1 month ago

    I think it may be like the Meta situation... The top AI researchers want to be able to publish.
    And if Apple wants them, they have to bend.
    Of course, that wasn't a problem for Meta culturally, since they were already active in open source. But Apple... was probably forced to make that choice.

  • @Create-The-Imaginable
    @Create-The-Imaginable 1 month ago

    Please cover AI more like your life depended on it! And I am not kidding either! 🤔

  • @JohnKing93
    @JohnKing93 1 month ago +1

    Please cover more AI 🙏🏽😄🤖

  • @thripnixe
    @thripnixe 1 month ago

    iSheep: "oh my god, Apple released an open source LLM"
    normal people: "meh"

  • @CrushedAsian255
    @CrushedAsian255 1 month ago +2

    Kinda worried that my M3 computer won't be able to run all the fancy AI stuff; they're probably gonna release M4 chips just for the AI stuff.

    • @NicolasSilvaVasault
      @NicolasSilvaVasault 1 month ago +1

      That's exactly where the rumors are going: the M4 won't be a big upgrade over the M3, but will focus on AI.

    • @AndrewSu180
      @AndrewSu180 1 month ago +2

      I imagine it will have some AI improvements, but models already run great on M3s...

  • @christiancrow
    @christiancrow 1 month ago

    Google is by far the slowest and needs a fingerprint to even work. Siri is hands-free for as far as she can hear you clearly, and Siri works offline for what I need; try using a calculator offline, then tell me different.
    And I hope Apple's ELM will be on the Google Pixel, since it needs a huge upgrade to copy Apple's offline mode.

  • @Pentross
    @Pentross 1 month ago +17

    They don’t call it AI because nothing is AI yet, it’s just machine learning

    • @RobertMcGovernTarasis
      @RobertMcGovernTarasis 1 month ago +2

      This, and it bugs the heck out of me that it's still called AI.

    • @maidenlesstarnished8816
      @maidenlesstarnished8816 1 month ago +6

      This argument always seems so odd to me. How do you define intelligence? What would it take for an artificial system to be considered intelligent? Because I think to most people, intelligent behavior IS intelligence.

    • @David_Box
      @David_Box 1 month ago +2

      Pointless semantic arguments; these terms don't have any widely agreed upon rigorous definition, they're more so marketing buzzwords.

    • @edism
      @edism 1 month ago +3

      Lol they're one and the same. Back in the day even photocopiers and scanner software came with AI in the form of OCR. Machine learning is merely training it.

  • @AndrieMC
    @AndrieMC 1 month ago

    Fucking mojang is gonna make an ai language model at this point

  • @isbestlizard
    @isbestlizard 1 month ago +3

    It's stochastic parrots all the way down

    • @MagnumCarta
      @MagnumCarta 1 month ago +1

      Good band name.

    • @4.0.4
      @4.0.4 1 month ago +1

      There is ternary quantization so it could literally be just many if statements.

    • @v_6302
      @v_6302 1 month ago

      AI models have been shown to generalize beyond their training data...

  • @StiekemeHenk
    @StiekemeHenk 1 month ago +18

    PSA: Apple isn't private and they do collect a lot of data. They just don't like others doing so, and they aren't a data or advertisement company, so they don't share your data. They sell you expensive devices and software.
    They market privacy heavily, but Apple's privacy claim only holds if you'd rather have all your data with one company instead of most of your data spread over 20 companies.

    • @pencilcheck
      @pencilcheck 1 month ago

      How is it different from Google? Sounds like Google is even worse, since Google not only wants your data in one company but also sells your data and uses it for marketing and ads.

    • @NastyWicked
      @NastyWicked 1 month ago +1

      lol

    • @bill_the_duck
      @bill_the_duck 1 month ago +1

      Sure, you can say they collect “a lot” of data because they collect some and they’re a massive company, but they collect a LOT less data than their competitors.

    • @sofianikiforova7790
      @sofianikiforova7790 1 month ago +1

      1 company…. And the US government… lol

    • @StiekemeHenk
      @StiekemeHenk 1 month ago

      @@bill_the_duck Location, health, fitness, email, name, age, address, gender, 10 categories of interest, device information, keyboard language, carrier, all search queries, basically anything in their ecosystem. Some, like health and fitness, are opt-in.
      And of course as much as possible is "anonymized", but undoing that is of course as trivial as riding a bike. We all know that.
      The idea that they collect less data than others is wrong. They even purchase data from third-party data brokers for their first-party ads.
      Apple does, however, show good intent by not sharing data and actually processing a lot on-device.

  • @Cfitterhappier
    @Cfitterhappier 1 month ago

    One thing to note... their Hugging Face score for MMLU is 25... this means it's almost guessing at random. This is an extremely bad score, which means its comprehension of language is going to be very low.
    It will probably still be better than Siri.
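
For context on why a score of 25 reads as guessing at random: MMLU questions are multiple choice with four options, so uniform random guessing lands at about 25% accuracy in expectation. A tiny illustrative simulation (the question count is arbitrary):

```python
# Illustrative only: expected accuracy of uniform random guessing on
# 4-option multiple-choice questions (like MMLU) is 1/4 = 25%.
import random

random.seed(0)
n_questions, n_options = 10_000, 4
# Treat option 0 as the correct answer for every question (WLOG).
correct = sum(random.randrange(n_options) == 0 for _ in range(n_questions))
print(f"random-guess accuracy ~= {correct / n_questions:.1%}")  # ~25%
```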

  • @mauricetroisville646
    @mauricetroisville646 1 month ago

    Is the video sponsored by Apple, or is it just fanboyism?
    The video summarized: Apple is amazing! 🎉
    The funniest thing, though, is that you say Apple is so privacy-centric that they have no data they could use 😅

  • @sleev1091
    @sleev1091 1 month ago

    There are so many AI tools that seem interesting, but the fact that they run remotely and collect as much data as they can off users' devices makes me not consider using them for even a second. Apple running AI models locally sounds really appealing.

  • @willsims
    @willsims 1 month ago

    Less params for mobile! :)

  • @a3onz3ro31
    @a3onz3ro31 1 month ago +1

    The Llama 1 paper showed that it's possible to build a good LLM with just open source datasets. Meta is still the king of open sourcing stuff in ML (PyTorch, datasets, Llama itself). It's nice to see that Apple is doing the same, but their other papers, like the ones on CLIP, don't provide the dataset. I guess if Meta didn't open source, none of these (Google, Microsoft or Apple) would. With what OpenAI is doing in the name of safety, I guess one day their AGI will laugh at them like, "you humans called that AGI? You stopped contributing to research to make money? No wonder you guys want to go extinct" 🤖

    • @v_6302
      @v_6302 1 month ago

      Llama's weights were accidentally leaked on GitHub and downloaded by other people. Meta had no choice but to open source it, so that they could at least enforce via a license that it can only be used non-commercially. Any open source model larger than 7B parameters is based on Llama, either by fine-tuning it or by using it to generate training data for smaller models. But that costs about 1% of the total training cost that Meta paid. The data was never the problem; many datasets are open source. The problem is the training (Llama's training cost was, I think, around 2 million dollars). No open source entity can afford such prices.
      The "no moat" and "open source is awesome" talk from OpenAI is so they can say they aren't a monopoly. But open source cannot compete...

  • @canadiannomad2330
    @canadiannomad2330 1 month ago +1

    I get the feeling the big players are trying to team up to break OpenAI with the help of open source...

    • @Mitch-xo1rd
      @Mitch-xo1rd 1 month ago +2

      God, I hate how OpenAI is all closed source; those fuckers turned their back on the people who made them big. Without the open source community they would never have been able to get enough popularity to get the VC money that made them king of AI.

    • @estiennetaylor1260
      @estiennetaylor1260 1 month ago

      @@Mitch-xo1rd Open source is a place for freeloaders.

    • @crism8868
      @crism8868 1 month ago

      Didn't Microsoft manage to get one of their guys on OpenAI's board of directors? They can either break it or control it depending on which way the winds of regulation are blowing.

    • @estiennetaylor1260
      @estiennetaylor1260 1 month ago +1

      @@crism8868 Microsoft doesn't need OpenAI to function. Can't say the same about OpenAI without Microsoft.

    • @estiennetaylor1260
      @estiennetaylor1260 1 month ago +1

      @@crism8868 Satya Nadella said in an interview that Microsoft owns all of OpenAI's patents and the technology behind it. They're more than capable of creating an in-house AI by themselves.

  • @Owenvfx247
    @Owenvfx247 1 month ago +4

    NO WAY 💀

  • @niciusb
    @niciusb 1 month ago

    You could say this video is a recipe for glazed apples, because the fanboyism on this one is crazy.

  • @feitan8745
    @feitan8745 1 month ago

    It's open until it's not open anymore.

  • @hlokomani
    @hlokomani 1 month ago +1

    Safe to say the next WWDC might be mind blowing (pun intended)

  • @spounka
    @spounka 1 month ago

    We have Apple doing open source and amazing contributions
    And we have Canonical doing absurd decisions and pushing users away
    Strange times

  • @theshy6717
    @theshy6717 1 month ago

    I feel like this is a move to cover for what they have been gatekeeping and to try to win devs' trust again; they're still the most evil of all the tech giants, contributing the least but gatekeeping the most.

  • @gnaarW
    @gnaarW 1 month ago

    Was this video sponsored by apple? =D

  • @ustav_o
    @ustav_o 1 month ago

    apple and open source in the same phrase?

  • @s3rit661
    @s3rit661 1 month ago

    Apple, arriving 1 year late to the AI race.
    C'mon Apple, just pay OpenAI's fees :D

  • @succatash
    @succatash 24 days ago

    Good AI helps apple

  • @pencilcheck
    @pencilcheck 1 month ago

    I don't think open sourcing has anything to do with the stock going lower; people who invest in stocks all know those are just typical fluctuations and manipulations by brokers and banks. If you don't know that, then you've just never invested before.

  • @DomesticMouse
    @DomesticMouse 23 days ago

    Moar AI content plz

  • @anonymouscommentator
    @anonymouscommentator 1 month ago

    Apple is clearly trying to develop small models for phones. This is not a Llama competitor but a Microsoft Phi-3 competitor. Unfortunately, that means the performance will be meh at best compared to something like Llama 3 8B 😅