Groq's AI Chip Breaks Speed Records

  • Published: 27 Jan 2025

Comments • 451

  • @mancerrss
    @mancerrss 11 months ago +334

    She really pushed it by interrupting the AI. The response latency is absurdly lower than anything else out there.

    • @dattajack
      @dattajack 11 months ago +17

      Ohhh I see. Oh I see.

    • @Ikxi
      @Ikxi 11 months ago

      Neuro-sama's latency is also really good

    • @retratosariel
      @retratosariel 11 months ago +1

      It’s not an LLM, what’s wrong with people?

    • @andy-moggo
      @andy-moggo 11 months ago

      ​@@retratosariel haha

    • @retratosariel
      @retratosariel 11 months ago

      @hadinterest she’s still confusing one thing for another, and many people in the comments are doing the same.

  • @HarshPatel-gr4uu
    @HarshPatel-gr4uu 11 months ago +359

    the lady is hammered

    • @BurlapQC
      @BurlapQC 11 months ago +73

      high on coke for sure lmao

    • @2010RSHACKS
      @2010RSHACKS 11 months ago

      You’ll lose your fuckin head for smuggling coke into Dubai. Much doubt

    • @danyunowork
      @danyunowork 11 months ago +38

      Well she is British and it was after 6 pm.

    • @6lack5ushi
      @6lack5ushi 11 months ago +4

      @@danyunowork well on her for staying somewhat lucid

    • @coolemur976
      @coolemur976 11 months ago +5

      Yeah, she is also on heroin

  • @reza2kn
    @reza2kn 11 months ago +50

    @04:48 Poor lady doesn't understand how LLMs work, and repeatedly asks "did you tell it?! are you sure?!", and the guy's like "Ma'am! We only make them go vroom vroom! Nothing else" :)

    • @sageofsixpaths98
      @sageofsixpaths98 11 months ago +3

      I'm sure she was joking

    • @HiThisIsMine
      @HiThisIsMine 11 months ago +4

      @@sageofsixpaths98 - She was indeed not joking. She didn’t even understand that this guy's company just makes chips. She started the clip with, “what makes your AI different from everyone else?”
      Why would she ask if the guy made the AI talk about octopus hearts?

    • @ashh3051
      @ashh3051 11 months ago

      @@HiThisIsMine she would say that for her audience, who mostly have no idea what is going on here.

    • @HiThisIsMine
      @HiThisIsMine 11 months ago +2

      @@ashh3051 - Not buying it. I know the tactic of asking obvious questions as a reporter.. this ain’t it.

  • @ethair8975
    @ethair8975 11 months ago +327

    The lady has absolutely no idea what the man just said.

    • @LoganEjtehadi
      @LoganEjtehadi 11 months ago +13

      Or what groq said

    • @patrickmesana5942
      @patrickmesana5942 11 months ago +5

      Why would you assume that?

    • @wisdomfromthecave
      @wisdomfromthecave 11 months ago +11

      @@patrickmesana5942- watch again

    • @Erin-Thor
      @Erin-Thor 11 months ago +12

      False assumption that says more about you than her.

    • @chrism3440
      @chrism3440 11 months ago +23

      @@Erin-Thor She asked Groq the exact same question she asked the interviewee, after he explained that her question wasn't accurate. Groq is not a model; it's a technology to enhance the function/speed of existing models. It was clear that she didn't understand.. and guess what.. that's okay :)

  • @DaveDFX
    @DaveDFX 11 months ago +140

    She's drunk cuz she saw her days in this job are numbered

    • @jaffar1234
      @jaffar1234 10 months ago +1

      Why? The lazy reporters are too happy to use the language models to write their reports

  • @n8style
    @n8style 11 months ago +117

    That speed is seriously impressive, congrats to the team!

    • @pravicaljudem1814
      @pravicaljudem1814 11 months ago

      If there are more users talking to it, it becomes 10-10,000x slower, just saying

    • @Danuxsy
      @Danuxsy 11 months ago

      I can't wait to be able to play video games with neural agents, like World of Warcraft, where you're also able to speak to them about all kinds of things while playing. omg! so cool

  • @wisdomfromthecave
    @wisdomfromthecave 11 months ago +57

    like arguing with a drunk relative during a holiday...

    • @mickelodiansurname9578
      @mickelodiansurname9578 11 months ago

      I wonder how she managed to get so sloshed in a Muslim country? I mean, you have to hand it to these journalists and their ability to sniff out a drink!

  • @goatmonkey2112
    @goatmonkey2112 11 months ago +124

    You had better be nice to our new AI overlords.

    • @dniilii
      @dniilii 11 months ago +9

      I pledge my allegiance to the Machine God.

    • @robn2497
      @robn2497 11 months ago

      you don't want to become a paperclip

  • @AntonioLopez8888
    @AntonioLopez8888 11 months ago +54

    Seems like she’s hanging around in the bars every second day

  • @truthruster
    @truthruster 11 months ago +55

    If the AI were really human, it would have also cussed at her for treating it like a robot.

    • @Jaybee6428
      @Jaybee6428 11 months ago +3

      It's christian 😆

    • @SockOrSomething
      @SockOrSomething 11 months ago +6

      Lol, I mean, if it were programmed to do that; but programming annoyance and anger into the machine would be where the problems begin

    • @dijikstra8
      @dijikstra8 11 months ago +3

      @@SockOrSomething Unfortunately it's not really something you program in; it can very well be an emergent property. The model learns from humans, and humans don't exactly provide a good example. That's why they have so many people working on these models after they are trained, in order to restrict them before they are released to the public.

    • @gabydewilde
      @gabydewilde 11 months ago

      @@dijikstra8 I wanted to generate some images by telling it only what not to put in them. The first one was a head on a table. I walked away from the computer instead of making more.

    • @Dogbertforpresident
      @Dogbertforpresident 11 months ago

      Funny how Groq evoked this feeling.

  • @MrKrisification
    @MrKrisification 11 months ago +22

    Wow. That speed is amazing. Really starts feeling like a natural conversation.

  • @zgolkar
    @zgolkar 11 months ago +69

    She tried to break it on purpose, then she was surprised and told him when the system got her interruption 😂

    • @relative_vie
      @relative_vie 11 months ago

      or it’s staged lol…

    • @jeffsteyn7174
      @jeffsteyn7174 11 months ago

      ​@@relative_vie sure. 😢

    • @mysticflounder
      @mysticflounder 6 months ago

      apparently this type of stuff is what CNN saves the actual 'gotcha' journalists for

  • @JonasViatte
    @JonasViatte 11 months ago +24

    Husband at home: "So honey, I wanted to ask you,.."
    "GOT IT!!!"

  • @sorinankitt
    @sorinankitt 11 months ago +24

    Great presentation.
    This video will be nostalgic in 30 years, like the videos we watch today of the internet's first days being demoed, which make us feel "superior" to the experiences of the users back then while having fond memories of our own first experiences. Today this is amazing. In 30 years, we'll have those fond memories.

    • @HiddenPalm
      @HiddenPalm 11 months ago

      Unless AGI is achieved in the next 8 years, taking control of "The Gospel" a military AI targeting system used by Israel that increased their targets from 50 a year to over 100 a day since 2019 (The Guardian has an article on it). If one does the math, The Gospel has been mostly targeting over 100 civilians every single day since early October, sometimes 300-400 a day. Not even the ICJ nor the ICC have touched this, which means this beast has free rein. Therefore, fundamentally taking control of our defenses and using them to cull and domesticate the wild human animal, potentially killing millions in the process within the next 20-30 years, just short of when you thought you would be alive to experience "fond memories".

    • @antispectral5018
      @antispectral5018 11 months ago +2

      You mean 1 year…

    • @sorinankitt
      @sorinankitt 11 months ago

      @@antispectral5018 the way progress is going, most likely a year.

    • @Danuxsy
      @Danuxsy 11 months ago

      I think you will look back in 10 years and feel that.

    • @dawn21stcentury
      @dawn21stcentury 11 months ago

      In 30 years?
      Did you mean 30 weeks?
      Because if not, think about that ESSENCE of EXPONENTIALITY

  • @BeyondBinaryExploringAI
    @BeyondBinaryExploringAI 11 months ago +18

    Congrats to the Groq team! Going to make a video about this - Amazing!

  • @philtrem
    @philtrem 11 months ago +10

    He had just emphasized this is not an LLM, and then she's like "let's ask Groq!". lol

  • @gokulvshetty
    @gokulvshetty 11 months ago +80

    the interviewer looks so drunk and high 😆😆😆😆

    • @Charles-Darwin
      @Charles-Darwin 11 months ago

      zooted for sure

    • @Corteum
      @Corteum 11 months ago +6

      She's got that oldschool coca cola ;)

    • @gokulvshetty
      @gokulvshetty 11 months ago +5

      @@Corteum she is trying too hard to control herself

    • @aouyiu
      @aouyiu 11 months ago

      @@gokulvshetty drunk and high folk don't do that; maybe she's on some kind of (legal) stimulant.

    • @Maayavey12
      @Maayavey12 10 months ago

      😂😂😂

  • @ModMonk3y
    @ModMonk3y 11 months ago +17

    Lmfao, she had no clue what was going on, and on top of that, did you hear a tinge of annoyance from the AI!?!? 😂😂😂

    • @soggybiscuit6098
      @soggybiscuit6098 11 months ago

      "You listening to CNN" 😂

    • @mickelodiansurname9578
      @mickelodiansurname9578 11 months ago

      AI model's thoughts after the interview: "Your job is so mine bitch.... you better pray we don't meet again!"

  • @flottenheimer
    @flottenheimer 11 months ago +12

    Wondering if xAI's Grok will eventually run on Groq?

    • @timber8403
      @timber8403 11 months ago +3

      I think that’s a croq 😊

    • @mickelodiansurname9578
      @mickelodiansurname9578 11 months ago +2

      I think they will need to double-check the contract for typos or it could get very confusing.

  • @RedWhite-m4c
    @RedWhite-m4c 11 months ago +46

    I want my internal monologue to have the energy of the interviewer

  • @tonybrown9208
    @tonybrown9208 11 months ago +3

    Consider that the amazing response time we see here is even faster than it appears: some of the time to respond was the text response being translated to speech. Absolutely mind-blowing!

  • @punyan775
    @punyan775 11 months ago +4

    That was awesome. And how fast it responded, and that they didn't have to press anything to interrupt it; the AI knew when it was being interrupted, unlike ChatGPT's voice-to-voice feature. Groq's chips are awesome and I can't wait to see what comes out of this

  • @kobby2g8
    @kobby2g8 11 months ago +11

    Would like to see how this scales across millions of requests

  • @Hrishi1970
    @Hrishi1970 11 months ago +2

    The moment they realised they had not signed off properly and said, "Thank you very much...", you know AGI is here...

    • @dijikstra8
      @dijikstra8 11 months ago +1

      So far these models don't really have long-term memory though ;), they only "remember" what's in the context window; long-term memory would require back-propagating the data from your experience with the model into the neural network. Although who knows what kind of data they feed it when they train a new model, so you may as well be polite to avoid trouble in the future!

  • @AmandaFessler
    @AmandaFessler 11 months ago +10

    Servicing businesses sounds like a great and logical start. But I'm holding out hope for consumer grade AI chips. Maybe one day I can run something like a 10x200B MoE model on my own rig without having to stack multiple RTX bricks. That's the dream. The capacity to run image/video generators would be a nice option, but even if just for LLMs, I'd be happy to save up for one.

    • @Charles-Darwin
      @Charles-Darwin 11 months ago +1

      likewise; even 3/4 capacity or a lower tier would have been nice. at this rate there will be overlords vs peasants in no time.

    • @weatherwormful
      @weatherwormful 11 months ago

      well, if you've got an Nvidia 30 or 40 series card, they just came out with a chatbot you can run locally. It's early days and it might not be at the level of what's in the video, but it looks interesting

    • @CypherDND
      @CypherDND 11 months ago

      What's the name of this? @@weatherwormful

    • @weatherwormful
      @weatherwormful 11 months ago +1

      @@CypherDND RTX Chat or something like that

    • @CypherDND
      @CypherDND 11 months ago

      @@weatherwormful danke

  • @augmentos
    @augmentos 11 months ago +6

    this host is doing a one-woman show. Amazing LPU speed, huge potential

  • @justtiredthings
    @justtiredthings 11 months ago +3

    that wine mom who's a lot of fun but also kind of terrifying

  • @InterpretingInterpretability
    @InterpretingInterpretability 9 months ago

    Where is the code for this available?

  • @crawkn
    @crawkn 11 months ago +5

    Jonathan confused me at the end. It sounded as if they are designing hardware, but then he said "they build the models, and we make it available to those who want to build applications." I guess he means they make the hardware available in servers to the app operators?

    • @HiddenPalm
      @HiddenPalm 11 months ago +1

      By models, he means the language models others make, like the open source ones and the private sketchy ones like GPT. And I guess Groq wants to sell their hardware to developers making language models.

    • @mickelodiansurname9578
      @mickelodiansurname9578 11 months ago +2

      At the moment their model is that they supply a web UI and a serverless API endpoint, with I think just Llama 2 and Mixtral 8x7b on their servers, and inference is done through their hardware... which is called an LPU (Language Processing Unit). It's about 100 times faster in tests I've done with basic chat... so fast that the only problem is the latency we all get using an API. They aren't actually selling the cards as far as I know... and I don't know if they sell isolated hardware in the cloud, 'cos otherwise you are sharing the server farm with other users and there's a time delay from prompt to their system supplying an available LPU. I could be wrong there.

    • @crawkn
      @crawkn 11 months ago

      @@mickelodiansurname9578 Thanks. It seems as if they would do better business if they were clearer about exactly what they're selling.
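
The thread above describes Groq as offering a serverless, OpenAI-compatible API endpoint. As a rough illustration only (the URL and model name below are assumptions based on Groq's public documentation, not anything shown in the video), a single chat-completion request could be assembled like this:

```python
import json
import urllib.request

# Assumed OpenAI-compatible endpoint; check Groq's docs for the current URL.
GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_request(prompt: str, model: str = "mixtral-8x7b-32768",
                  api_key: str = "YOUR_API_KEY") -> urllib.request.Request:
    """Assemble (but do not send) the HTTP request for one chat completion."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        GROQ_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_request("How fast is an LPU?")
```

Sending `req` with `urllib.request.urlopen` (and a real key) would return the completion; the point of the commenter's observation is that with Groq the time-to-first-token over this same API is what feels instant.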

  • @ruypex7977
    @ruypex7977 11 months ago +6

    500 tokens per second on which model?

    • @realworga
      @realworga 11 months ago +6

      Llama 7b, 70b based models, and Mixtral

    • @reza2kn
      @reza2kn 11 months ago +2

      Mixtral 8x7b
      around half of that for Llama Chat 70b

  • @isaacdiaby
    @isaacdiaby 11 months ago +2

    The latency is amazingly fast, wow! Great job, guys. I wonder if the TTS was streamed too, as that could make it feel even faster to the end user.
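
The streaming idea in the comment above is a real latency trick: instead of waiting for the full text, the app can cut the incoming token stream at sentence boundaries and hand each sentence to TTS immediately. A minimal sketch of that chunking step (the token stream itself is assumed, not part of any real Groq API):

```python
# Cut a stream of text tokens at sentence boundaries so speech synthesis
# can start on the first sentence while the rest is still generating.
def sentence_chunks(token_stream):
    buf = ""
    for tok in token_stream:
        buf += tok
        if buf.rstrip().endswith((".", "!", "?")):
            yield buf.strip()
            buf = ""
    if buf.strip():          # flush any trailing partial sentence
        yield buf.strip()

# e.g. list(sentence_chunks(["Hello", " world", ".", " Bye", "."]))
```

Perceived latency then becomes time-to-first-sentence rather than time-to-full-response.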

  • @philfrankly
    @philfrankly 11 months ago +7

    I love humans but that machine was much more likeable.

    • @cube2fox
      @cube2fox 11 months ago

      🤖💀

  • @Mpho-xv9ko
    @Mpho-xv9ko 11 months ago

    I saw Groq do its thing 5 months ago, but now I'm honestly sold, and this is the very same idea I had in mind all along.

  • @USASMR-o2c
    @USASMR-o2c 11 months ago +2

    This woman is the ghetto version of CNBC's Becky Quick.

  • @kozad86
    @kozad86 11 months ago +2

    I need Siri to be this good. 🤣

    • @califomia
      @califomia 11 months ago

      I don't think she'll be called Siri at that point. Apple needs to graduate from 8 years in kindergarten directly to PhD professor. Maybe then I'll switch back to iPhone 😅

    • @theflipbit01
      @theflipbit01 10 months ago

      I experimented with Siri Shortcuts connected to the Groq API; the response time is pretty impressive. But of course we cannot interrupt Siri while she is talking. lol

  • @jefflucas_life
    @jefflucas_life 11 months ago

    When is this coming to the iPhone?

  • @jimlynch9390
    @jimlynch9390 11 months ago +2

    When can I get one?

    • @cybernit3
      @cybernit3 11 months ago +1

      Looks like you can buy the PCIe card from Mouser for $20k USD... wow expensive...

  • @JonasViatte
    @JonasViatte 11 months ago +4

    Isn't it kind of confusing that there's xAI's Grok, and now Groq?

    • @mickelodiansurname9578
      @mickelodiansurname9578 11 months ago +1

      Triple confusion when you consider that Groq (the hardware vendor) was set up back in 2016, and xAI's Grok just last year... plus Groq owns the trademark on their name... Thankfully Elon noticed this and is planning on changing the name of Grok to either Mocrosoft Bindows, or I Bee M!

    • @ashh3051
      @ashh3051 11 months ago

      Perhaps Elon didn't expect that startup to become a mainstream success and be discussed in the media like it has.

    • @mickelodiansurname9578
      @mickelodiansurname9578 11 months ago

      @@ashh3051 It was being discussed in the media, well, the AI media, back when Groq produced their first demonstration paper on this tech in 2019, I think? Before the pandemic some time... Plus the founder of Groq is Jonathan Ross, and he's fairly famous in AI: the guy that created the Google TPU processor. Okay, the average joe in the street likely never heard of this tech; in fact the average joe in the street is still clueless... but I'd be fairly sure those in AI all knew who they were since at least 2021. Having said that, I'm a little surprised that the AI enthusiasts over on Reddit weren't aware either... they are all over there trying to work out why this new type of GPU only has 230MB of VRAM.

  • @ScenicNHTech
    @ScenicNHTech 10 months ago

    We need that voice interface on the laptop -- not just on mobile like openai. When you can have a back and forth voice conversation (esp. something that can talk back) and at speed like this, even with interruptions, it becomes very natural and you can really talk through things with someone/something knowledgeable. If you add citations, you can fact check too.

  • @ashh3051
    @ashh3051 11 months ago

    So are these guys focused on inference or training as well?

  • @peterfallman1106
    @peterfallman1106 11 months ago

    Can't find anywhere to buy the stock

  • @chieftron
    @chieftron 11 months ago +2

    These chips/cards need A LOT more memory on them. 230MB of on-board memory is appalling. The fact that you have to buy 72 of these cards just to get 16GB of memory is insanity, especially when each card costs $20k. Sure it's cool that it's fast, but not when it costs $1.44 million for 16GB of memory, when that will barely be enough to run a 7b model at fp16.
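
The arithmetic behind the figures in that comment checks out (taking the commenter's $20k list price and 230MB-per-card numbers as given):

```python
# Verify the commenter's memory-vs-cost math.
cards = 72
sram_per_card_mb = 230
total_gb = cards * sram_per_card_mb / 1024    # total on-card SRAM, ~16.2 GB
total_cost = cards * 20_000                   # at the quoted $20k per card
weights_7b_fp16_gb = 7e9 * 2 / 1e9            # 7B params x 2 bytes (fp16) = 14 GB

print(round(total_gb, 1), total_cost, weights_7b_fp16_gb)
```

So 72 cards give about 16.2GB of on-chip memory at $1.44M, while a 7B model at fp16 already needs 14GB for weights alone, before any KV cache.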

  • @thesimplicitylifestyle
    @thesimplicitylifestyle 11 months ago +1

    This will be perfect for the customer service industries. Very exciting!

  • @christiancrow
    @christiancrow 11 months ago

    When will there be a 10-100 TOPS/tokens USB-C version?
    Like a flash-drive size and a portable-GPU size

    • @christiancrow
      @christiancrow 11 months ago

      Come on, please answer; at least I am not poking fun at the woman like others are ❤, I need to know....
      It would be so cool to have a farm of ASIC flash drives

  • @GaZonk100
    @GaZonk100 11 months ago +1

    at least the AI doesn't start a sentence with 'ok so'

  • @capfever
    @capfever 11 months ago

    What's the stock name?

  • @frazerhainsworth08
    @frazerhainsworth08 11 months ago

    that guy saying thank you is trying to make sure AI is his friend during the purge.

  • @MarkusDiersbock
    @MarkusDiersbock 10 months ago

    What a great high-energy interviewer.
    Her interruptions were actually quite clever; the AI will have to contextually understand this and adapt

  • @paulscheffler4771
    @paulscheffler4771 11 months ago +1

    Is the Groq crypto token, which is trading on Uniswap, at all associated with the company?

  • @karlvuleta
    @karlvuleta 11 months ago

    I'm curious what the LLM was running on. If it's an external server the speed is still impressive, but if this was all running locally on the laptop with the chip installed that's actually insane

  • @radu1006
    @radu1006 11 months ago +1

    I am lost. Where is the chip? What are we talking about in this video? I don't care about the model, where is the chip? How big is it and what power does it draw?

  • @PlacatePro
    @PlacatePro 11 months ago +1

    The way he tried to correct her English accent in the pronunciation of Groq was wild. Unless it was just me seeing it differently. Then, when she repeated it somewhere in her question the same exact way, I read his body language and I think he thought he was being clowned. Am I reading too far into it?

  • @SANEBLG
    @SANEBLG 11 months ago

    As a brit, I have to ask, is this what our reporters are going to the states and doing? Why is she drunk? Did she think she was meeting Jonathan Woss?

  • @Rakstawr
    @Rakstawr 11 months ago

    Which one of these is the AI?

  • @dattajack
    @dattajack 11 months ago

    The edge Groq has over the other chatbots is that Groq doesn't repeat your questions before answering them. It just gets straight to the answer. The ones that repeat your questions are already irrelevant.

    • @chieftron
      @chieftron 11 months ago +2

      Groq is hardware, not a model or "chatbot". They make LPUs (language processing units)

    • @unom8
      @unom8 11 months ago +1

      You can define what the response structure looks like with just about all LLMs; repeating the question has always been optional

    • @sylversoul88
      @sylversoul88 11 months ago

      @@unom8 so how can I get Bing/Copilot to answer my questions without repeating them? Do I have to request this in every single prompt?

    • @unom8
      @unom8 11 months ago

      @@sylversoul88 I haven't used those lately, tbh, but when hosting an LLM there is a bit of boilerplate added to every prompt; this is likely where the question-repeating behaviour comes from

    • @unom8
      @unom8 11 months ago

      @@sylversoul88 I tried to link to an article, but that was silently removed (thanks YT) - I would suggest you search for beginner prompt engineering
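
The "boilerplate" unom8 mentions is, in OpenAI-style chat APIs, just the system message sent with every turn. A minimal sketch of steering any such model to skip the question-echo (the instruction wording here is illustrative, not from the video):

```python
# Prepend a system message so the model answers without restating the question.
# Works with any OpenAI-style chat API that accepts a messages list.
def make_messages(question: str) -> list[dict]:
    return [
        {"role": "system",
         "content": "Answer directly and concisely. "
                    "Never restate or repeat the user's question."},
        {"role": "user", "content": question},
    ]
```

The resulting list is what gets posted as the `messages` field of a chat-completion request; because you control the system message when self-hosting, the repetition behaviour is configurable rather than baked in.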

  • @FoodWithShay
    @FoodWithShay 11 months ago

    That is cool. I tested it now, but you have to type in the question. Maybe the paid version uses the mic.

  • @Joelmonterrey
    @Joelmonterrey 11 months ago

    I checked the website and there's no voice-to-AI interface; it's text only, like OpenAI. You know, this could be the next-generation Amazon Echo, and I'd pay for that.

  • @invisiblecollege893
    @invisiblecollege893 11 months ago +1

    Racing towards unleashing something we will have no way to control

  • @JackQuark
    @JackQuark 11 months ago

    He has a point: people have less patience on mobile platforms, and the AI's response speed is key to customer satisfaction.

  • @axl1002
    @axl1002 11 months ago

    Was that local?

  • @franssjostrom719
    @franssjostrom719 11 months ago

    ”of course you do” - a classy sober interviewer

  • @stonecookie
    @stonecookie 11 months ago

    The woman thanks the AI with all the emotive sincerity of thanking a human being.

  • @ParthKodape-b4n
    @ParthKodape-b4n 11 months ago +1

    Rrrrrrrrroooooms are packed.

  • @Srindal4657
    @Srindal4657 11 months ago

    This woman treated that AI like a piece of meat; show some respect. Many people are going to tell me it's a machine, but when it can reply that fast by voice, I don't know. It's much more real than before.

  • @davidanalyst671
    @davidanalyst671 11 months ago +1

    Can someone explain how it took us 30 years to get Intel chips that do hyperthreading, but it took Groq 4 years to make a chip that does AI 1,000x faster than current Intel products? Like, what is going on? Is AI designing all these chips?

  • @Artanthos
    @Artanthos 11 months ago

    The human race is in a very critical period of time with the development of these technologies. If we get this wrong, it's game over.

  • @corvox2010
    @corvox2010 11 months ago

    How is this different from TPUs, which have always been faster than GPUs?

  • @AmazingArends
    @AmazingArends 11 months ago

    Ah, the expert reporting you find only on CNN LOL 😂

  • @muru603
    @muru603 11 months ago +2

    really fast response!

  • @linuxdevops726
    @linuxdevops726 8 months ago

    I just tried building some crazy stuff with a Raspberry Pi, and boy it is fooking fast, what a speed

  • @29stanmorestreet63
    @29stanmorestreet63 11 months ago

    From the little I have seen, this is a good candidate for passing the Turing test.

  • @rtnjo6936
    @rtnjo6936 11 months ago

    guys, when will you add an audio conversation?

  • @aligajani
    @aligajani 11 months ago

    The speed makes it more human

  • @Zon3cept
    @Zon3cept 11 months ago

    She is facing the threat of losing her job to an AI reporter.

  • @dbiswas
    @dbiswas 11 months ago +8

    This lady is so annoying; she keeps touching her face.

  • @StarOceanSora360
    @StarOceanSora360 11 months ago

    can't wait to use this

  • @dannyarcher6370
    @dannyarcher6370 11 months ago +1

    Will Musk be running Grok on Groq?

  • @nzku10011
    @nzku10011 11 months ago +5

    Revolutionary. This will definitely change the world.

    • @hilmiterzi3847
      @hilmiterzi3847 11 months ago

      Crazy scientists over there

    • @fuckkatuas2837
      @fuckkatuas2837 11 months ago

      This is WEF technology. What is their motto?

  • @siriyakcr
    @siriyakcr 11 months ago

    Wow incredible speed and content, anchor bit

  • @LeScribe2
    @LeScribe2 11 months ago

    This is not an AI!! She is a real person talking, because she got interrupted like a natural person, which happened accidentally during the talk

  • @thenanook
    @thenanook 11 months ago

    imagine the kind of chip Ben Shapiro uses to speak 😂😂😂😂😂

  • @jamiereibl9611
    @jamiereibl9611 11 months ago

    That's one way to get eaten by roko's basilisk 🐍

  • @wwkk4964
    @wwkk4964 11 months ago

    This is really really good!

  • @Dubey1210
    @Dubey1210 11 months ago +2

    Goooooot it 😂😂😂

  • @Glowbox3D
    @Glowbox3D 11 months ago +1

    She's like wearing steel wool undergarments.

  • @hello-world1
    @hello-world1 11 months ago +3

    incredible

  • @clavo3352
    @clavo3352 11 months ago

    I'd rather talk to Groq than my surgeon who botched my knee replacement.

  • @Jianju69
    @Jianju69 11 months ago +1

    I find myself wondering who hasn't read Stranger in a Strange Land?

  • @AlMansurpictures
    @AlMansurpictures 11 months ago +1

    So, there's Groq with a Q, and Elon also has Grok with a K.... Alright, alright 😅

  • @SeattleSpursFan1882
    @SeattleSpursFan1882 11 months ago

    A great use case for your chip, Groq, would be to transition the interviewer in real time into one who understands your tech deeply and with empathy.
    Great work, by the way! 👍

  • @theresakain7837
    @theresakain7837 4 months ago

    This experience is incredible 😂😂

  • @hilmiterzi3847
    @hilmiterzi3847 11 months ago +1

    I've applied!

  • @musicboy2003
    @musicboy2003 11 months ago

    Not the best interviewer, but I’m excited about having a quicker, more natural response time from LLM apps. Brave new world!

  • @blubblubee
    @blubblubee 11 months ago

    wow! a black box. So mysterious

  • @ModMonk3y
    @ModMonk3y 11 months ago +1

    Also, ALWAYS say thank you … it’s the only way you will survive the AI takeover 😂

  • @GaZonk100
    @GaZonk100 11 months ago

    we'll all be dumb, unfree, and really fast

  • @YoungMoneyFuture
    @YoungMoneyFuture 11 months ago +1

    Groq + Grok= Groqk😂

  • @a3103-j7g
    @a3103-j7g 11 months ago +1

    wanna catch it "with its trousers down"? Just ask it what the time is. You might be surprised by its reaction.

  • @mangagod
    @mangagod 11 months ago

    If it could know who was talking to it, it would be better. For instance, if you look at your phone while talking, it can assume you are engaging with it, versus looking away from your phone. Call its name while looking away to re-engage with it.

    • @mickelodiansurname9578
      @mickelodiansurname9578 11 months ago +1

      well, you just put a system message into the model's parameters. What you suggested there is in fact one of the things used in AI application development to make the model more effective. So fair enough you didn't know that, but at the same time it's impressive you noticed how important it is...
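
A minimal sketch of the idea in this exchange, assuming a hypothetical gaze signal from the device's camera (the function and flag names here are illustrative, not any real API): the app injects the engagement state into the system message and decides whether the assistant should respond at all.

```python
# Hypothetical per-turn message builder: runtime context (gaze, wake word)
# is folded into the system message so the model knows if it's being addressed.
def build_turn(utterance: str, user_is_looking: bool, wake_word: str = "groq"):
    engaged = user_is_looking or wake_word in utterance.lower()
    system = (
        "The user is looking at the device; treat speech as addressed to you."
        if engaged else
        "The user is looking away; respond only if directly addressed."
    )
    messages = [{"role": "system", "content": system},
                {"role": "user", "content": utterance}]
    return engaged, messages
```

Calling the assistant's name while looking away flips `engaged` back on via the wake word, which is exactly the re-engagement behaviour the comment describes.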

  • @jaylxxxi1908
    @jaylxxxi1908 11 months ago

    Grok will be better, faster, stronger.