Comments

  • @CodingTutorialsAreGo 4 years ago +1

    Source code available at: github.com/JasperKent/InMemoryDatabase-Best-Practices

  • @olegsuprun7590 a year ago +2

    Hi, indeed the in-memory DB helps us with unit testing by removing a lot of boilerplate code and managing relations between entities for us. But I think SQLite in-memory will also give you additional safety by enforcing that entities comply with the database schema: required fields, foreign key restrictions, etc., which allows us to catch those errors in unit tests. What do you think about using a SQLite in-memory DB for unit tests?
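    For reference, a minimal sketch of how a SQLite in-memory database is usually wired up for EF Core tests (ShopContext is a hypothetical DbContext here, and the connection must stay open, because the database only lives as long as the connection does):

        using Microsoft.Data.Sqlite;
        using Microsoft.EntityFrameworkCore;

        // Sketch only: ShopContext is a stand-in for your own DbContext.
        using var connection = new SqliteConnection("DataSource=:memory:");
        connection.Open();                                // closing the connection discards the database

        var options = new DbContextOptionsBuilder<ShopContext>()
            .UseSqlite(connection)
            .Options;

        using var context = new ShopContext(options);
        context.Database.EnsureCreated();                 // builds the real schema, so constraints are enforced

    That last line is where the extra safety comes from: an insert that breaks a foreign key constraint, for example, now fails inside the unit test, whereas the EF In-Memory provider doesn't enforce relational constraints like foreign keys.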

    • @CodingTutorialsAreGo a year ago +1

      The problem is that the unit tests are there to test the code, not the DB configuration. If SQLite is just there for testing and you use (e.g.) SQL Server for production, then you have two problems:
      1) You're spending effort configuring SQLite just to run unit tests that verify you've configured SQLite correctly.
      2) If there are any discrepancies between how SQLite and SQL Server work (which there are), then you'll have to rerun your tests against SQL Server anyway.

  • @aanders0n 8 months ago +1

    Is this still relevant for unit testing a DbContext in EF Core now in 2024, with the repository pattern?
    And thank you for your videos, they are all really interesting and educational.

    • @CodingTutorialsAreGo 8 months ago

      The problem with the repository pattern is that it just holds a single copy of the data in memory. Thus there is no need for eager loading, so you can't test that you've got your eager loading right.
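      As an illustration, a small sketch of the kind of check an In-Memory DbContext makes possible and a fake repository doesn't (Blog and Post are hypothetical entities seeded through a separate context instance; the assertion uses xUnit):

          using Microsoft.EntityFrameworkCore;              // for Include
          using Xunit;

          // A fresh context has nothing tracked, so navigation properties come back
          // unloaded unless the query eagerly loads them.
          using var context = new BloggingContext(options);
          var blogs = context.Blogs
              .Include(b => b.Posts)                        // remove this and Posts comes back unloaded
              .ToList();
          Assert.All(blogs, b => Assert.NotEmpty(b.Posts)); // fails if the eager load was forgotten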

  • @alexdevorigin1 4 years ago

    Is this better than a local load or a normal data load? Is the speed of the in-memory database higher? And if you implement MARS, would it be better?

    • @CodingTutorialsAreGo 4 years ago +1

      I only use the In-Memory Database for testing, so performance isn't very important, beyond a basic threshold. In comparison with, say, SQL Server, the In-Memory Database should be faster, but only because of the small amounts of data we use for testing and the connection overhead with SQL Server.

    • @alexdevorigin1 4 years ago

      @CodingTutorialsAreGo OK, because I have a repository on my GitHub with a project with good integrations, but 3,450 rows usually take about 40 seconds to load, and 10,000 or more take almost 8 minutes with both SQLite and SQL Server. Also, deploying it to IIS takes twice as long. Would you recommend loading in memory?

    • @CodingTutorialsAreGo 4 years ago

      @alexdevorigin1 Excellent question!
      First thing to say is that 8 minutes to load 10,000 rows sounds VERY slow, so you may want to check whether there's a specific issue there.
      More generally though, we can't really consider relational databases and In-Memory databases as alternatives for the same job. They do very different things.
      The primary purpose of a relational database such as SQL Server is to store data persistently - beyond the lifetime of a program. So if we are looking for alternatives we can look at different types of permanent database, or even things like flat files.
      In many situations we need to take data from a persistent store and manipulate it in memory (though not always; it's often worth considering doing the manipulation within the database itself with stored procedures etc.). An In-Memory database is a conceivable way of working in memory, but the alternatives to it are the traditional in-memory structures such as arrays, Lists and Dictionaries.
      My instinct on this would be to use those traditional in-memory collections. There's not much additional that an In-Memory database can do, since we can do LINQ queries on collections. In fact, where an In-Memory database behaves differently from a collection, it tends to be a feature deliberately intended to make the In-Memory database harder to use - and therefore more like a real database. For example, the problems mentioned in the video around eager loading are there deliberately so that in our unit tests we get behaviour similar to a real database. With in-memory collections, we don't have to worry about eager loading.
      In terms of the other things In-Memory databases do, I think the auto-generation of identities is not really relevant. With in-memory data we don't need surrogate identities, since the object's memory address serves as its identity. Indeed, we'd need to be careful, since the Ids created by the In-Memory database would likely be different from those generated by the actual database when the data was committed to a permanent store.
      One thing collections don't do is automatically fix up reciprocal references (e.g. if we create a reference from a parent to a child, then when we do SaveChanges(), the reference from the child back to the parent will be generated automatically). That said, this kind of reciprocity is not something I find regularly used in an O-O program, most likely because developers have found ways around the issue.
      In the end, the only reason I could see to use an In-Memory database rather than collections is performance. I don't have any general figures, but it's always best to measure these things on specific data. My guess is that the In-Memory database will be slower, since it's doing the various additional things mentioned above.
      Ultimately though, your data must be transferred from the permanent store into your program's memory, and looking at your figures, that's where the bottleneck is in your case.
      Hope that helps.
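      To make the fix-up point above concrete, a rough sketch (Order and OrderLine are hypothetical entities, using the EF Core In-Memory provider) of the reciprocal reference appearing once the graph is tracked and saved, which a plain List<Order> would never do for you:

          var order = new Order();
          var line = new OrderLine();
          order.Lines.Add(line);                     // only the parent-to-child reference is set here

          context.Orders.Add(order);
          context.SaveChanges();

          Console.WriteLine(line.Parent == order);   // True: EF has fixed up the reverse navigation
          // With plain in-memory collections, line.Parent would still be null unless set by hand.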

    • @alexdevorigin1 4 years ago +2

      @CodingTutorialsAreGo That's correct. Apparently I found the error, but I don't know how to solve it. That is, the fix is to paginate and filter the table where the data is loaded, since loading everything at once is where the slowness occurs. I looked for how to do it with JavaScript, HTML or Bootstrap, but no luck: it doesn't work with Razor Pages. The official page talks about it, but it requires organising this in the controller, and that's something I still don't know how to do.
      docs.microsoft.com/en-us/aspnet/mvc/overview/getting-started/getting-started-with-ef-using-mvc/sorting-filtering-and-paging-with-the-entity-framework-in-an-asp-net-mvc-application
      (the same, but for MVC Core)

    • @CodingTutorialsAreGo 4 years ago

      @alexdevorigin1 Thanks again for the question - I thought it worth going into in a bit more depth, so here's a new video that looks into it: ruclips.net/video/uQCuSOIM0Kw/видео.html
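      In essence, the server-side paging and filtering covered there boils down to something like this (Products, searchTerm, pageIndex and pageSize are placeholders, not names from the project):

          using Microsoft.EntityFrameworkCore;           // for ToListAsync

          // Sketch only: filter and page in the database instead of loading every row.
          var page = await context.Products
              .Where(p => p.Name.Contains(searchTerm))   // filter first
              .OrderBy(p => p.Name)                      // Skip/Take needs a stable ordering
              .Skip(pageIndex * pageSize)
              .Take(pageSize)
              .ToListAsync();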

  • @alexdevorigin1 4 years ago +1

    Great