How is THIS Coding Assistant FREE?

  • Published: 14 May 2024
  • Cody is Sourcegraph's AI coding assistant, and it has a couple of features that no other assistant has, to help make you a 10x developer. You can check it out here: srcgr.ph/ugzpQ
    🛒 Gear Links 🛒
    * 🍏💥 New MacBook Air M1 Deal: amzn.to/3S59ID8
    * 💻🔄 Refurb MacBook Air M1 Deal: amzn.to/45K1Gmk
    * 🎧⚡ Great 40Gbps T4 enclosure: amzn.to/3JNwBGW
    * 🛠️🚀 My nvme ssd: amzn.to/3YLEySo
    * 📦🎮 My gear: www.amazon.com/shop/alexziskind
    🎥 Related Videos 🎥
    🚀 LLMs: From Zero to Hero with M3 - • Zero to Hero LLMs with...
    😲 MacBook Performance: Developer Shocked! - • Developer Shocked by m...
    🤯 Mind-Blowing Machine Learning on Neural Engine - • INSANE Machine Learnin...
    🍏 Apple Silicon: Who Needs It? - • Apple Silicon is back,...
    🔗 AI for Coding Playlist: 📚 - • AI
    #programming #ai #softwaredevelopment
    - - - - - - - - -
    ❤️ SUBSCRIBE TO MY YOUTUBE CHANNEL 📺
    Click here to subscribe: www.youtube.com/@azisk?sub_co...
    - - - - - - - - -
    📱LET'S CONNECT ON SOCIAL MEDIA
    ALEX ON TWITTER: / digitalix
    CODY ON TWITTER: / sourcegraphcody
  • Science

Comments • 158

  • @JoeBurnett
    @JoeBurnett 5 months ago +161

    My concern is that in order for the suggestions to be presented, your code has to be sent via the plug-in to Cody/CoPilot and “fed” to the LLM on their servers. How do we know that these services aren’t stockpiling users’ code that may one day be divulged to the internet due to a security leak?

    • @arnobchowdhury9641
      @arnobchowdhury9641 5 months ago +37

      Very important question. Maybe that's why the feature, running your own LLM, is important.

    • @zzzyyyxxx
      @zzzyyyxxx 5 months ago +17

      This is literally the case for every cloud LLM plugin in editors right now; it's honestly not that big of a deal. If one does care, they can link a local LLM (a rough sketch of that follows after this thread).

    • @denisblack9897
      @denisblack9897 4 months ago

      I’ve been joking that I don’t upload to GitHub cause they train AI on our code… but now this joke isn’t fun anymore

    • @bdarla
      @bdarla 4 months ago +44

      The well-known saying also applies here: "If something is free, you are the product."

    • @npc-drew
      @npc-drew 4 months ago +1

      This holds especially true for young and hungry startups.
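
  For readers who want to try the local-LLM route mentioned in this thread, here is a minimal sketch (in Python) of talking to a locally served model through Ollama's OpenAI-compatible endpoint, so completions never leave the machine. It assumes a recent Ollama build is running and that a code model (codellama is used here only as an example) has already been pulled; this is not Cody's own integration path.

      # Minimal sketch: query a locally served model via Ollama's
      # OpenAI-compatible endpoint, so no code leaves the machine.
      # Assumes `ollama serve` is running and `codellama` has been pulled;
      # the model name is only an example.
      import requests

      def local_complete(prompt: str, model: str = "codellama") -> str:
          resp = requests.post(
              "http://localhost:11434/v1/chat/completions",
              json={
                  "model": model,
                  "messages": [{"role": "user", "content": prompt}],
              },
              timeout=120,
          )
          resp.raise_for_status()
          return resp.json()["choices"][0]["message"]["content"]

      if __name__ == "__main__":
          print(local_complete("Write a Python function that reverses a string."))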

  • @garfieldlim7625
    @garfieldlim7625 4 months ago +3

    Hey man! I'm a senior in computer science. I enjoy your content so much. I really look forward to you making videos on Apple's MLX framework. I have a 14" M2 Max with a 30-core GPU, and I'd really like to see how much it can handle vs other Macs.

  • @ikemkrueger
    @ikemkrueger 4 months ago

    It looks very promising. Testing it right now. Thanks for sharing it!

  • @joelcarlson8982
    @joelcarlson8982 5 months ago +7

    I have been using a plug-in called Pieces that does snippet storage, but it also has a chat feature where you can use cloud or local LLMs, along with adding portions of your code to the model's context.

    • @filmyguyyt
      @filmyguyyt 4 months ago

      same here

    • @getpieces
      @getpieces 2 months ago

      @@filmyguyyt Thanks for the support :)

    • @Stryker-K
      @Stryker-K 2 months ago +2

      That sounds awesome.

  • @koubbe
    @koubbe 5 months ago +66

    I’m concerned about privacy when Cody “addresses” my entire codebase.

    • @ajc4000
      @ajc4000 5 months ago +8

      Yeah, Alex probably gave up his entire work to Cody, OpenAI etc. by uploading it…

    • @jasonlitson
      @jasonlitson 4 months ago +10

      Alex did mention that his projects are open source.

    • @ajc4000
      @ajc4000 4 months ago +3

      @jasonlitson Fair enough. I still don't like the overall development (data is money).

    • @codechapin
      @codechapin 3 months ago +1

      @@jasonlitson open source projects have different licenses. Are those services adhering to the license of the project?

    • @kipchickensout
      @kipchickensout 2 months ago

      I don't mind for open source, but yeah, what if I want to try it at work?

  • @LukeBarousse
    @LukeBarousse 5 months ago +75

    At the rate we are going, I think it will only be a few years until local LLMs are "good enough" coding assistants
    (hopefully that will end subscription costs 🤷🏼‍♂)

    • @SahilP2648
      @SahilP2648 5 months ago +7

      On my 64GB M2 Max MBP, running phind-codellama:34b already gives me performance comparable to GPT-3.5 or even better, but maybe not on the level of GPT-4 just yet. Imo it will take maybe 6 months before that level runs on local hardware; a 96GB or 128GB M1 or M2 Ultra should be more than enough. Pair this with a GitHub project that creates a web UI on localhost, host that on ngrok, and you can use it anywhere. I have done exactly that and it's spectacular (a rough sketch follows after this thread). Better to run LLMs locally for confidential code. I used the ollama GitHub project btw, which also links related projects in its description.

    • @user-ob7fd8hv4t
      @user-ob7fd8hv4t 5 months ago +1

      @SahilP2648 Have you tried deploying other LLMs on-premises, or even fine-tuning them? I'm thinking about buying a 96GB M2 Max. I want to try fine-tuning some LLMs locally, and right now this MBP has some deals and it's very portable (which I value a lot).

    • @SahilP2648
      @SahilP2648 5 months ago

      @user-ob7fd8hv4t I haven't tried fine-tuning as I don't have the need to, and I am not an ML researcher (I'm an Android app developer). But it might be faster on a 4090. The main problem is VRAM, which Macs have plenty of if you can shell out the money. I have tried plenty of models, and for my purposes I can get away with a simple system prompt change. Inference speed is quite good too.

    • @HaseebHeaven
      @HaseebHeaven 5 months ago

      But you need much more powerful machines, that's the issue.

    • @fireninja8250
      @fireninja8250 4 months ago

      @user-ob7fd8hv4t Fine-tuning requires about 5-10x the VRAM of just running it. Keep that in mind.
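
  A minimal sketch of the "local model, reachable from anywhere" setup described in this thread, assuming Ollama is serving phind-codellama:34b locally and a tunnel such as ngrok exposes port 11434; the tunnel URL below is a placeholder, not a real endpoint.

      # Minimal sketch: stream a completion from a self-hosted Ollama server,
      # either on localhost or through a tunnel URL (placeholder below).
      # Assumes `ollama serve` is running and phind-codellama:34b is pulled.
      import json
      import requests

      BASE_URL = "http://localhost:11434"  # or e.g. "https://<your-tunnel>.ngrok.app" (placeholder)

      def stream_completion(prompt: str, model: str = "phind-codellama:34b") -> None:
          with requests.post(
              f"{BASE_URL}/api/generate",
              json={"model": model, "prompt": prompt, "stream": True},
              stream=True,
              timeout=300,
          ) as resp:
              resp.raise_for_status()
              for line in resp.iter_lines():
                  if not line:
                      continue
                  chunk = json.loads(line)  # Ollama streams newline-delimited JSON
                  print(chunk.get("response", ""), end="", flush=True)
                  if chunk.get("done"):
                      break

      if __name__ == "__main__":
          stream_completion("Explain what this regex does: ^[a-z0-9_-]{3,16}$")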

  • @user-uk1bx9vm4o
    @user-uk1bx9vm4o 5 months ago +12

    I am glad that I subscribed to you.

    • @AZisk
      @AZisk  5 months ago +1

      Thanks for subbing!

  • @coinboybit7281
    @coinboybit7281 3 months ago +1

    You can actually connect to a local LLM already. I connected it to Code Llama and it works well.

  • @camsand6109
    @camsand6109 5 months ago +6

    In theory, if your local LLM application presents an OpenAI-compatible API, you could use that in combination with something like Cody. I suppose even if it were locked down, you could edit your hosts file and point the OpenAI API hostname at your local LLM (a rough sketch follows after this thread).

    • @SuilujChannel
      @SuilujChannel 4 months ago +1

      Good idea with the hosts file! :) Of course this only works if the requests do not get proxied through some Cody endpoint first (I don't know if they are).

    • @InnocentiusLacrimosa
      @InnocentiusLacrimosa 4 months ago

      Now that is some original thinking 🙃
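
  A rough sketch of the hosts-file idea from this thread, assuming the plugin talks directly to api.openai.com, and ignoring the TLS caveat (a local proxy would still have to present a certificate the client trusts). The override line and the check below are illustrative only.

      # Illustrative only: after adding a hosts-file override such as
      #   127.0.0.1  api.openai.com
      # (in /etc/hosts on macOS/Linux), this checks where the name now resolves.
      # It does nothing if the plugin proxies requests through a vendor endpoint,
      # and the local server would still need a TLS certificate the client trusts.
      import socket

      def resolve(host: str) -> str:
          return socket.gethostbyname(host)

      if __name__ == "__main__":
          addr = resolve("api.openai.com")
          if addr.startswith("127."):
              print("api.openai.com now points at the local machine:", addr)
          else:
              print("No override in effect; it resolves to:", addr)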

  • @burrowsforge3538
    @burrowsforge3538 4 months ago +8

    I wouldn't bet on them adding local LLMs to the free tier, because it might discourage people from purchasing a premium tier if they can just run their own model locally. The free tier gives you a "taste", and the artificial limit is there to encourage you to buy the premium tier. I do like that you can try before you buy.

    • @c0t1
      @c0t1 3 months ago +1

      Sounds like we need an open-source "Cody" that runs on local LLMs. That would be awesome.

    • @ColeStarkPieces
      @ColeStarkPieces 2 months ago +1

      Pieces for Developers does this for free!

  • @sreeharisreelal
    @sreeharisreelal 4 months ago +1

    Can you make a video comparing the 18GB M3 Pro MacBook Pro to the 36GB M3 Pro MacBook Pro with regard to developers' needs, including VMs?

  • @gamerbath7921
    @gamerbath7921 4 months ago +1

    Any idea what LLM they are using? I can't find any information about it on their website.

  • @remigoldbach9608
    @remigoldbach9608 5 months ago +1

    Great video as always!

    • @AZisk
      @AZisk  5 months ago

      thanks 🙏

  • @arozendojr
    @arozendojr 4 months ago

    Could you compare the MacBook Air M1 with 8GB RAM vs the MacBook Pro M3 with 18GB RAM? I want to see whether the upgrade is really worth it.

  • @exnozgaming5657
    @exnozgaming5657 4 months ago

    They also support Mixtral 8x7B.

  • @s.mahdihosseini1933
    @s.mahdihosseini1933 11 days ago

    you are a man from heaven 😮
    that was my concern bro

  • @shalomrutere2649
    @shalomrutere2649 5 months ago +1

    The new 14th Gen Intel processors for laptops just got released. Hope to see you get one🙋

  • @softwareengineeringwithkoushik
    @softwareengineeringwithkoushik 5 months ago +5

    Always love to watch your videos.... Love from India

  • @j.r.r.tolkien8724
    @j.r.r.tolkien8724 4 months ago

    I've used it. It's great.

  • @justingarcia500
    @justingarcia500 5 months ago

    What Mac are you using in the video?

  • @fuzzy-02
    @fuzzy-02 4 months ago

    Wish you had addressed this rate limit more.

  • @leoingson
    @leoingson 5 months ago +3

    The UI is a bit scattered. Claude 2 is in the free version; for other models you have to go Pro. If you index your project, it gets uploaded (not sure what the other full-project coding AIs do there).

    • @leoingson
      @leoingson 5 months ago +1

      Actually, without this video the UI would have been quite confusing. Things like choosing the LLM and indexing the project are almost impossible to find.

    • @user-gy1dh3kq9p
      @user-gy1dh3kq9p 5 months ago +2

      Pro is free until early 2024, so what are you talking about?

    • @leoingson
      @leoingson 4 months ago

      @user-gy1dh3kq9p Wasn't obvious, thx; it does indeed work until the end of January!

  • @DinoFancellu
    @DinoFancellu 3 months ago +1

    Hmm, I'm on Cody Pro and IntelliJ, but I can't see any place to change the LLM.

  • @xXWillyxWonkaXx
    @xXWillyxWonkaXx 2 months ago

    Is there a way to run a local LLM with Cody? A workaround?

  • @Ceznex
    @Ceznex 3 days ago

    What do you think about Codeium?

  • @khristal000
    @khristal000 5 months ago +15

    Liking this video so he can buy some not-so-cheap car.

    • @AZisk
      @AZisk  5 months ago +20

      Thanks for the like! I don't need an expensive car, just need to save for the next MacBook so I can review it here.

    • @vatsalyavigyaverma5494
      @vatsalyavigyaverma5494 5 months ago +1

      Lmao

  • @damvcoool
    @damvcoool 3 months ago

    I am sure you are already aware, but there is in fact a "version" of Cody that can use a local LLM; it's called collama.
    To be honest, I have not compared it to Cody, but at a basic level it seems OK.

  • @MarkDLion
    @MarkDLion 4 months ago

    Does a viable local version of similar technology exist?

  • @simonj.k.pedersen81
    @simonj.k.pedersen81 4 months ago

    Why does this cost money? It is just an extension for VS Code that calls some APIs (or are they actually paying for the API consumption? I guess that would explain the rate limits in the free version).

  • @JuanFmTech
    @JuanFmTech 5 months ago

    How is Copilot free for open-source contributors? Is that a GitHub program?

  • @scosee2u
    @scosee2u 5 months ago +1

    Great video, local llm ftw!

    • @AZisk
      @AZisk  5 months ago

      Thanks!

  • @anjansingh3093
    @anjansingh3093 4 months ago +1

    Do you think an i7-1355U laptop with 32GB RAM and 1TB of storage is fine for learning this AI and LLM stuff, or is a MacBook Pro better?

    • @digigoliath
      @digigoliath 4 months ago

      And an NVIDIA RTX 3060 family card or higher with 8/12 GB of VRAM?

  • @saikrishnaa1374
    @saikrishnaa1374 4 months ago

    Hello, I am facing a slight electric-shock-like sensation on the top of the lid of my MacBook while charging with the supplied power adapter. I've tried multiple sockets and outlets and it's still the same. People on the internet call this tingling and say it is normal, but it causes discomfort when using the machine while charging. My question is: does everyone with the 2-prong adapter face this issue in countries where grounding is done with a third prong?

  • @RedDragon72q
    @RedDragon72q 3 months ago

    Here is an interesting idea: I wonder if you can train Cody on specific engines, for instance the Unreal Engine. Hmmmm

  • @bennguyen1313
    @bennguyen1313 3 months ago

    Any thoughts on how Sourcegraph AI's Cody compares to Perplexity AI or CodiumAI?

  • @armynyus9123
    @armynyus9123 1 month ago

    Why didn't you type that /smell over your code base? ;-)

  • @Ghost_Hybrid
    @Ghost_Hybrid 5 months ago +14

    As a rule: if it's free, you are the product.

    • @doemaeries
      @doemaeries 4 months ago +2

      I guess in this case you pay with your code, and they use it to further train their LLMs.

    • @EdKolis
      @EdKolis 4 months ago

      And do what else with it? How do you know it won't wind up in someone else's suggestions verbatim?

    • @maevwat
      @maevwat 4 months ago

      @EdKolis There is nothing wrong with that; even when you are using Copilot/ChatGPT, you are using someone else's code. The problem is when your private info, like API keys, ends up in their LLMs.

    • @EdKolis
      @EdKolis 4 months ago

      @@maevwat what if your code contains trade secrets?

  • @rayband3826
    @rayband3826 5 months ago +1

    If I'm just starting off on learning programming, is it worth it to install something like this? Or is it more detrimental to learning to have help like this?

    • @VforVanish
      @VforVanish 5 months ago +2

      It can allow you to learn faster, so I think it's worth it.

    • @AZisk
      @AZisk  5 months ago +4

      this needs to be part of your learning now

    • @rayband3826
      @rayband3826 5 months ago

      Thanks for the answer, @AZisk!

    • @rayband3826
      @rayband3826 5 months ago

      That would be neat. Thank you, @VforVanish.

    • @XiaolinDraconis
      @XiaolinDraconis 4 months ago

      I'm not a coder by any means, but I've written apps for myself several times by searching through wiki help articles, example scripts, etc. Similar concept: I didn't go to school or learn anything crazy, I just got the feel for it enough to build something simple, and that alone was enough to help me understand a little when looking at stuff like this.

  • @v-for-victory
    @v-for-victory 4 months ago +4

    No worries, you pay for it with your data.

  • @chuff_daddy
    @chuff_daddy 5 months ago +1

    thumbnail goes crazy

    • @AZisk
      @AZisk  5 months ago +1

      is that good or bad?

    • @chuff_daddy
      @chuff_daddy 5 months ago +1

      @@AZisk excellent in my opinion. love your videos man!

    • @AZisk
      @AZisk  5 months ago

      thanks 🙏

  • @golmatol6537
    @golmatol6537 2 months ago

    So how do we know Cody is not saving (stealing) our code somewhere else? I would likely be fired if my boss knew I was using unauthorized coding assistants.

  • @filmyguyyt
    @filmyguyyt 4 months ago +1

    Pieces is better, y'know, I think.

  • @MeinDeutschkurs
    @MeinDeutschkurs 5 months ago +1

    Why should they implement local models? They want to make money with it.

  • @thiagoassisfernandes
    @thiagoassisfernandes 4 months ago

    Check out Codeium next!

  • @moinulmoin
    @moinulmoin 4 months ago +1

    My favorite one is Codeium AI :)

    • @freqtion
      @freqtion 4 months ago +1

      Vs Cody, which one is better?

    • @moinulmoin
      @moinulmoin 4 months ago

      @freqtion Codeium

  • @bernRA
    @bernRA 5 months ago

    uh oh. uh oh Alex? yes Alex. Alex? uh oh Alex. is that clippy Alex?

  • @wilsonjevin
    @wilsonjevin 4 months ago

    You had me at free

  •  3 months ago

    One indexing pass and the LLM has all your logic, passwords, settings, database names, structures, etc. that are stored in the project. Too risky to run, especially on commercial projects.

  • @angstrom1058
    @angstrom1058 3 months ago +1

    You'd best be better than the "coding assistant", or else your coding career is done. Any idiot can just use the "coding assistant". The good news is that coding assistants still produce wrong/bad/crap code, full of errors and implementations that may seem to work, but only later do you realize they don't work the right way.

  • @notoriouslycuriouswombat
    @notoriouslycuriouswombat 4 months ago

    Cody is now Generally Available, and Cody Pro is free until February 2024.

  • @mworld
    @mworld 4 months ago

    It's free because you are the product.

  • @stephenvalente3296
    @stephenvalente3296 5 months ago +2

    Only totally free until February 2024; after that it's $9/month for unlimited completions, versus 500 completions on the free tier.

  • @addy7445
    @addy7445 4 months ago

    Yeah I get where the hype is coming from, I'm pretty sure Cody will dethrone Roman at Wrestlemania.

  • @watchchat
    @watchchat 5 months ago

    It’s free because your code is their code 😆

  • @z.8477
    @z.8477 2 months ago

    Free tier:
    0 / 500 autocomplete suggestions this month
    0 / 20 chat messages and commands this month

  • @LarsRyeJeppesen
    @LarsRyeJeppesen 4 months ago

    Have to say, Duet AI made me jump ship from Copilot.

    • @gtarptv_
      @gtarptv_ 4 months ago

      Can you tell me why? Duet AI seems to be very unaware of my code. It seems I have to give it a lot of context for it to spit something good back.

  • @aghileslounis
    @aghileslounis 5 months ago +4

    For those who don't know, it is NOT aware of your ENTIRE code base. Nope.
    It just uses vector embeddings to form a context when you ask it something, then adds that context to your query before sending it to the LLM (a rough sketch of the idea follows after this thread).
    Nothing crazy.
    For an LLM to be aware of your entire code base, you would need to fine-tune it! If Cody offered a way to fine-tune an LLM on your repo, then it would be true that it knows everything about your repo.

    • @leoingson
      @leoingson 5 months ago

      Claude2 has 128k context, so..

    • @dotinsideacircle
      @dotinsideacircle 4 months ago

      @leoingson GPT-4 Turbo is 128k and Claude 2 is 100k. If you want better, Claude 2.1 is 200k.
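
  A toy sketch of the embedding-based context assembly described in this thread (not Cody's actual implementation): chunks of the repo are embedded, the chunks most similar to the question are retrieved, and they are pasted into the prompt before it is sent to the LLM. The embed() helper below is a stand-in for a real embedding model.

      # Toy sketch of retrieval-augmented context assembly (not Cody's actual code).
      # embed() is a placeholder for a real embedding model; here it just hashes
      # words into a fixed-size bag-of-words vector so the example is runnable.
      import hashlib
      import numpy as np

      def embed(text: str, dim: int = 256) -> np.ndarray:
          vec = np.zeros(dim)
          for word in text.lower().split():
              idx = int(hashlib.md5(word.encode()).hexdigest(), 16) % dim
              vec[idx] += 1.0
          norm = np.linalg.norm(vec)
          return vec / norm if norm else vec

      def top_k_chunks(question: str, chunks: list[str], k: int = 2) -> list[str]:
          q = embed(question)
          # rank chunks by cosine similarity (vectors are already normalized)
          return sorted(chunks, key=lambda c: float(np.dot(q, embed(c))), reverse=True)[:k]

      def build_prompt(question: str, chunks: list[str]) -> str:
          context = "\n---\n".join(top_k_chunks(question, chunks))
          return f"Use this code context:\n{context}\n\nQuestion: {question}"

      if __name__ == "__main__":
          repo_chunks = [
              "def parse_config(path): ...  # loads YAML settings",
              "class UserRepo: ...  # database access for users",
              "def render_invoice(data): ...  # PDF invoice generation",
          ]
          # The resulting prompt (retrieved context + question) is what goes to the LLM.
          print(build_prompt("How do we load settings?", repo_chunks))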

  • @ckhomphzxspaul8455
    @ckhomphzxspaul8455 5 months ago +1

    Why is this guy weirdly attractive 🥰

  • @SeanBotha
    @SeanBotha 4 months ago

    Weakest point: it is not available in Visual Studio. Since I use VS rather than VS Code, this is a no-go for me. Shame, I would have liked to try it.

  • @Apenschi
    @Apenschi 5 months ago +1

    I expect Google's upcoming coding assistant to be the best and COMPLETELY free!

    • @sushicommander
      @sushicommander 5 months ago

      They’ve already announced that it will cost $19 a month and you can only sign up annually.

    • @JuanFmTech
      @JuanFmTech 5 months ago +1

      Why would it be free?

    • @soloflo
      @soloflo 5 months ago

      I checked Bard (using Gemini Pro now) against ChatGPT 4 and Bard made some very incorrect statements (no pun intended) regarding C++ code I pasted. Very incorrect and even fundamentally wrong stuff that even novice programmers will see.

    • @Apenschi
      @Apenschi 5 months ago

      @sushicommander Ok, disappointing! And even more expensive than Copilot??!? Do you have a link?

    • @Apenschi
      @Apenschi 5 months ago +1

      @JuanFmTech The rest of the dev tools are also free, and if it runs locally, I see no reason to charge for it!

  • @EntangleIT
    @EntangleIT 5 months ago +3

    moar! lol🙀

  • @Mr-Yuki
    @Mr-Yuki 4 months ago

    500 autocompletions per month, so it's not technically free, since those can all be used up in a day. Nice video tho!

  • @ShreshthTiwari
    @ShreshthTiwari 3 months ago

    Codeium

  • @chairphobic
    @chairphobic 4 months ago

    I thought Codeium was better.

  • @StefanoV827
    @StefanoV827 4 months ago

    Not free anymore: 500 autocompletions per month on the free tier, then $9/month.

    • @AZisk
      @AZisk  4 months ago +1

      did you say it’s not free and free tier in the same sentence? 😆

    • @StefanoV827
      @StefanoV827 4 months ago +1

      @@AZisk ok, "not Unlimited anymore" ...

  • @tails_the_god
    @tails_the_god 4 months ago

    i am notorious for stinky code

    • @AZisk
      @AZisk  4 months ago +1

      me too :)

  • @D_Analyst007
    @D_Analyst007 4 months ago +1

    Nothing is free; they use your coding data to train their models :)

  • @lolo2lolo491
    @lolo2lolo491 2 months ago

    New sub here, and I like your video, but I don't like how you constantly do these zooming-in effects (like at min 1.3) and jump frames all the time... These effects give me a headache, and I start to lose focus on the video and lose interest in learning something from it. Could you please stop doing that in future videos?

  • @daniels-mo9ol
    @daniels-mo9ol 4 months ago

    Have fun with all your broken code that the AI hallucinated was "secure" and "optimal".

    • @AZisk
      @AZisk  4 months ago

      If one has fun coding, one has fun reviewing code. Always review your code and AI-generated code.

  • @btogkas1
    @btogkas1 3 months ago

    Free but limited

  • @nukesean
    @nukesean 4 months ago

    You should use an LLM to come up with a video title that isn't so cringe and clickbaity. Such a YouTube cliché.

  • @Nubbley
    @Nubbley 27 days ago

    AI is a tool to teach you to code when no teacher is available. If you already know how to code, don't use AI?

  • @user-tl7df2tn9l
    @user-tl7df2tn9l 4 months ago

    your video is TOO LOUD

  • @JakubSK
    @JakubSK 4 months ago

    I don't like it. Too black-box.

  • @firstlast493
    @firstlast493 4 months ago

    500 auto suggestions per month 👎

  • @irrelevantdata
    @irrelevantdata 5 months ago +1

    How is it free?! It lets you ask something like 10-20 questions a MONTH, and some stupidly low number of completions. You have to pay to actually use the thing. What a rip-off.