1:45 Github Page / README / Overview - github.com/hackingthemarkets/tradekit
4:38 Why Docker?
8:24 Packages Included
11:04 Running Locally, Dockerfile
16:25 More Library Discussion
31:45 Directory Structure, Examples
34:30 Web UI, Possible Examples
38:30 Database Schema - should one be included?
39:00 Running a command on a container
43:05 What are your thoughts/feedback? Is this useful? What should be part of a starter kit / common codebase?
Follow me on Twitter - twitter.com/parttimelarry
Buy me a Drink - buymeacoffee.com/parttimelarry
Hi Larry!!! Let's add the TradingView charting library to your Docker container? I have code for this task. :-)
May I suggest coingecko and crypto.com wrapper libraries?
Nice work Larry and Happy New Year. Suggestions to add are the CCXT library and Numba.
Thank you for putting this together. Usually I avoid Docker, but I'm willing to give it a shot.
Quantopian was great and Zipline was so easy to use. I should have backed up my notebook before they ran to Robinhood!
The GitHub version of Zipline added strict version-locked build requirements, and now I can't get it to compile. Hopefully Alpaca stays independent.
I'm truly hopeful that the YouTube algorithm rewards you for your epic level of preparation!
Everyone should like this video so it gets more views.
WOW! I'm so ready for this video. I've been looking for someone to help put it all together from start to finish, with explanations along the way. Awesome.
God damn! So that is what you were working on all this time that you were MIA! I'm buying calls on the channel reaching 50k by March next year 👌👌
Lol, best comment. By the way, I'm a put seller. How many do you want?
@@theverybestdev huh put seller 🙄 I'm a true wsb guy because stocks only go __😋
@@theverybestdev also I'd like as many calls as my 401k can buy 😋😋
Hi Larry. This type of content is unique on the whole internet. I'm a software developer and I can tell you there is no content like yours. I'm a Java developer, and with your videos I just developed a trading bot using all your ideas. I think it would be useful if you taught us how to combine MACD and RSI in a trading bot. In my opinion, traders often use these two indicators combined.
God bless you for sharing your knowledge. You deserve at least 1M subscribers. Regards from Mexico
I'm stoked for this, Larry! I'll be eagerly following along. It's the perfect project to take my algo trading to the next level. I think this series is going to blow up your channel, as you hoped. There's a demand for this kind of high quality content, and I think there's a movement towards democratising quant. You're at the forefront of it. Blessings to you, man!
Thanks Part Time Larry, you really knock it out of the park with your easy-to-understand videos and your cool way of teaching; it makes them easy to watch.
Yes. This is perfect for teaching. I worked on an issue on Windows with PATH and switched to my Fedora laptop because it is easier to start development. Thanks for all you do Larry!
Larry posts a video... you upvote it and then watch 😊... great stuff man, as usual
Larry, I've watched several of your videos and have to say that you are a great teacher and clearly know your stuff. I'm an experienced computer engineer with a CS degree from MIT and your channel has been super interesting to me and given me a lot of good ideas. I love the concept of this Docker Image to make it easy to launch into your projects with minimal setup required. One thing that I've been thinking about is I'd love to be able to build my own local database with historical stock quotes, as well as historical 10Q fundamental data so we can do both technical and fundamental analysis and backtesting. I'd also like to support multiple different data sources, so we can cross-check their data against each other and highlight any discrepancies. Great work on your channel! If you need anyone to help with anything, please let me know. Thanks!
Seriously amazing, you went above and beyond and I hope you know that.
Very new to this, and I must say this has to be the easiest way to set it up. Mate, you're a legend. Thank you!
Congratulations on 10K plus!!
Larry, this is GREAT!! An intraday data downloader to a DB will be very useful. Your channel is really awesome, thanks for everything.
Thanks for the compliment! Glad you are enjoying it!
Best coding channel for traders hands down!
Thank you very much. Second time I've run Docker and everything was a success.
Wow. This is next level and goes over my head. We may need separate tutorials for each topic. Can you make a video on how to make such awesome YT videos? Too good, Larry!
Agreed, will need to do a full Docker tutorial where I start from scratch.
fantastic job! happy to contribute. you are doing a great job with this channel and I hope you get to 1MM subscribers.
Hey Larry, I'm a few years late in looking at your work in general, but I definitely appreciate your efforts and your delivery. Although I have a long background in both Financial Services and IT, I had no exposure previously to Python or Docker. I am retired at this point and very happy to still be learning something new. Great job - start to finish.
This TradeKit project is an excellent idea; I wonder if you may have updates to this project?
Awesome!! Thanks a bunch for all of the useful resources you give us all, wish you the best for this new year Larry.
Larry, you have outdone yourself, my friend. I watched the entire video and didn't even realize it was a little less than an hour! I'm really excited to start using Docker. I wanted to learn more about it but didn't quite have a project to use it on; now I do! Thank you so much!
Thanks this comment means a lot! I make really long form content and I am always trying to experiment and see what keeps people engaged. So many things going on in the world, to get anyone's attention for even 5 minutes is amazing. I am excited to expand upon the Docker image discussed here and build on it, glad to hear there is interest -- I think it will add a lot of value as we build more sophisticated applications.
I don't think you should trim your package list at all. Space is cheap, and including a package doesn't degrade performance if it isn't being called. You have a point about people having a broker preference or data source preference, as two examples.
Thanks for all the hard work you expended putting this together.
Would really love follow ups on this! Absolutely love the videos dude!
Just wow, a breathtaking idea. This is a major step.
Dude, just discovered your channel. You're the real G.O.A.T. Can you do a video on your experience with trading overall and algo trading in particular? A little insight into your trading style and successes/lessons learned would be really cool. Thanks for what you do!
This is just gold, mate. Big thanks to you!
Thanks, glad you liked it!
Thanks for the awesome work! I was bogged down in dependency hell and now I have solid ground to start from. Zipline maintenance/development has apparently continued under the name zipline-trader, which you might want to have a look at.
Awesomeness!
I like this direction and would definitely be interested in helping out as well. If you're still up for feedback, I would say settle on certain tech, and we can all just ask/push for alternatives and see where things go. These videos have fast-tracked me to making my own trading ecosystem and have inspired me to push forward. Ty
Larry, this is great, I love this, I will actually use this. My suggestion on your question of the size and packages to include: maybe leave the full repo as it is, with full functionality, but create lightweight images uploaded to Docker Hub with different themes like forex trading, crypto trading development, etc.
Bro, I'm so glad I found you! All my interests in one place. Thank you for your time, sir.
EPIC sums it up. I have been thinking about how big the symbol table is. I would like a way to create multiple shorter lists, so a few more fields to narrow down what is processed; for example, ETF and/or Sector are the first two that come to mind. Maybe a drop-down in the UI to select the list. Time for me to watch a few Docker videos and get up to speed for my own version. Thanks as always.
Hi PartTimeLarry. I'm new to your channel (great, by the way) and the elements within TradeKit are new to me, but I'm here to learn and create a trading bot along with a dev platform. Quick question re TradeKit: what happened to this project? No updates in a while. Did you drop the project, as it seems not to be up with your current thinking, i.e. backtesting (you seem to favour VectorBT) and Alpaca (I have to use IBKR because of broker restrictions for the country I live in; I haven't checked Alpaca)?
I like your approach to TradeKit: a common set of tools for you to teach against, getting your followers up to speed quickly, and deployment to any local or cloud platform, all containerized.
Hey Larry, thanks for the great container! Do you plan on continuing support for TradeKit? I noticed that you haven't committed anything new to the repo.
Really guys, not a single comment about Macallan, shame on you!
Brilliant video as always.
Yay, PostgreSQL! Also, please show how to build a Docker image. I was wondering about having one Docker image that does all of the data collection and calculations and sends signals to another Docker image that receives the signals and calculation results and only trades. I have a small group of friends who will use the bot; each of their bots should store their own buy/sell records, trade, and do their own backtesting. Is this something that can be done relatively easily?
Definitely an actual Docker tutorial to come. As far as your thoughts on having multiple containers communicate, here are my thoughts. Instead of packaging all of these up into one big image, I would create a smaller Docker image just for data collection/calculation in this case. This container would just maintain a connection to a data feed, would have the data analysis libraries (e.g. pandas, indicator libraries, machine learning, whatever), and would be slimmed down and customized to only know how to do the work it needs to do. Then a separate small image could just contain the libraries and logic for connecting to a broker and executing trades.

To communicate, there would just be a message queue. In a previous job, we used Amazon SQS. So the data collection/calculation container you describe would say "oh, I have a signal, let me tell the trading container" and would send a message to SQS (or another system - you could even write the records to a database and make a simple queue that way). The other container, which only knows how to execute trades, would run a process that polls the queue for new signal messages. You could even run multiple processes and make sure they are up and running using supervisord if you are going to be publishing tons of messages and need a bunch of workers doing various things in parallel. When the workers consume/receive the messages, they do their work, then go right back to checking the queue for any new signal messages. You also might run PostgreSQL on its own and allow both containers to connect to it if both systems need to store data/results.
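For anyone who wants to experiment with this pattern, here is a minimal sketch of the SQS handoff with boto3 - the queue URL, message fields, and function names are illustrative, not part of TradeKit:

    import json, boto3

    sqs = boto3.client("sqs", region_name="us-east-1")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/trade-signals"  # illustrative

    # Producer (data/calculation container): publish a signal message
    def publish_signal(symbol, side):
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"symbol": symbol, "side": side})
        )

    # Consumer (trading container): long-poll the queue and act on signals
    def poll_signals():
        while True:
            response = sqs.receive_message(
                QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
            )
            for message in response.get("Messages", []):
                signal = json.loads(message["Body"])
                # place the order with your broker library here
                print("executing", signal)
                sqs.delete_message(
                    QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"]
                )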
@@parttimelarry I am not very familiar with Postgres (yet); however, coincidentally I was reading about the Postgres "listen" and "notify" feature. I see that psycopg has an async notifications feature. Instead of having extra services in between, wouldn't it be faster and easier to maintain if we had it as part of the app? If so, an actual walkthrough would be nice.
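For reference, the core of LISTEN/NOTIFY is small - a rough sketch with psycopg2, where the DSN and channel name are placeholders:

    import select, psycopg2
    import psycopg2.extensions

    conn = psycopg2.connect("dbname=tradekit user=postgres")  # placeholder DSN
    conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)

    cur = conn.cursor()
    cur.execute("LISTEN trade_signals;")  # another session runs: NOTIFY trade_signals, 'payload'

    while True:
        # Block until the connection's socket has activity, then collect notifications
        if select.select([conn], [], [], 60) != ([], [], []):
            conn.poll()
            while conn.notifies:
                notify = conn.notifies.pop(0)
                print("got signal:", notify.payload)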
Seeing you use VS Code and GitHub, you might want to look at the support for working with dev containers (i.e. setting up a .devcontainer folder). More details here: code.visualstudio.com/docs/remote/containers. It works with GitHub Codespaces and for local development. You just open the folder in VS Code and it starts the containers.
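A compose-based devcontainer config can be as small as this - a sketch of .devcontainer/devcontainer.json, where the service and workspace names are illustrative and a docker-compose.yml at the repo root is assumed:

    {
      "name": "tradekit",
      "dockerComposeFile": "../docker-compose.yml",
      "service": "tradekit",
      "workspaceFolder": "/workspace",
      "extensions": ["ms-python.python"]
    }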
Hi Larry. This is a pretty cool thing you put together. I have a general question on the architecture of this whole thing. Suppose I write an app that collects data from a specific exchange via websocket. Then I would dockerize it, compose it via YML, and let it collect and store the data. Then another container would work with this data. Would this be a proper setup (i.e., splitting data collection and data usage into two separate containers)? My understanding is that if I don't split it into two containers, every time I change my "data usage" code and re-build, I will need to reset the entire database and all my previously collected data will be gone. Is that right? What is the best way to handle it?
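One note on the rebuild worry: if the database runs as its own service with a named volume, rebuilding the app containers leaves the data untouched. A minimal docker-compose sketch (service names and credentials are illustrative):

    version: "3"
    services:
      db:
        image: postgres:13
        environment:
          POSTGRES_PASSWORD: example
        volumes:
          - pgdata:/var/lib/postgresql/data   # data survives app rebuilds
      collector:
        build: ./collector   # websocket data collection
        depends_on: [db]
      consumer:
        build: ./consumer    # works with the stored data
        depends_on: [db]
    volumes:
      pgdata: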
Hi Larry,
Have you thought about using the CCXT package? It might be a good suggestion that could potentially replace some of the packages that you use.
Cheers
Yes, I have seen this and have been meaning to try it out. I agree there are some great live trading open source packages that would probably save me from reinventing the wheel :). Thanks, there are a few extra packages I have been meaning to add to this.
@@parttimelarry Cool, will be looking forward!
Amazing video!! A must watch... No doubt about it
Subscribed to this channel one month ago. The past episodes YT suggests every day still impress me. Local quant classes charge more than $200 for a 4-hour tutorial. None of those lecturers know Docker/containers.
Thank you, glad you appreciate it. I love comments like this
Looks amazing, mate! I used to use JupyterLab a lot, but only for setting up a pandas dataframe and manipulating it; it provides an easier way to visualize tables than the console of an IDE. I've never and would never use it for running or deploying an app - like you described, it's mainly just for research purposes.
Building a data loader utility as part of the TradeKit CLI would be super dope. See the QuantRocket docs for inspiration.
Checking out QuantRocket - assuming other companies have already worked on this problem, so I want to see how everyone approached it.
Would love to hear your thoughts on how to integrate Dash applications within FastAPI routes, and also whether Jinja templates can be used with Dash.
At minute 44 you mention that it might be too much; perhaps you could have a "lean-with-larry" branch and a "just-the-basics" branch or something like that. I know from my perspective, we aren't all experienced enough to know what we might want or not want, so more is better. I can't thank you enough for building this and creating this video. It was a missing link for me - something I've been wanting to learn and explore, but I wasn't prepared for the million different rabbit holes I would have to go down; here you just gave us one giant rabbit hole.
As a side note, I notice you don't use Anaconda, but I find that a lot of newer Python folks, myself included, were taught to start with Anaconda and so have grown accustomed to working with conda environments, etc. If I wanted to keep with that tradition, would I just fork your repo, then somehow add Anaconda or a lite Anaconda to this container? (I wouldn't quite know how to do that, of course.) If so, could I stay up to date by setting your repo as the upstream and pulling in changes to my remote (my GitHub fork), then down to my local instance (the container), while in the container (assuming git is installed in the container)? I assume rebuilding the container every time would be massively inefficient. BTW, I may have butchered the terminology there.
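On the conda question, one common approach - a sketch only, not part of the TradeKit repo - is to base a fork's Dockerfile on the Miniconda image and install the requirements into a conda environment:

    # Sketch of a fork's Dockerfile built on Miniconda (tag, env name, and paths illustrative)
    FROM continuumio/miniconda3

    WORKDIR /app
    COPY requirements.txt .

    # Create an environment and install the pip requirements into it
    RUN conda create -y -n tradekit python=3.9 && \
        conda run -n tradekit pip install -r requirements.txt

    # Make the environment the default for subsequent commands
    SHELL ["conda", "run", "-n", "tradekit", "/bin/bash", "-c"]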
Kudos to you for the approach taken. I hate that I am just getting to this video - I have been going through the videos to build out the trading bot, thought I would comb through and see where all of this is going, and ran into this one. My feedback is WOW. I think you are definitely on to something, and I want to share what I think picks up where you left off when it comes to the database structure. I call it the "Header Washer". The idea is to bring in any CSV or XLS file with different header names and have it update the header names to work with the code. There would be a Master Header Alias to make the updates automatically in the background. I have already written the code in VBA to work with Excel. It would not be too challenging to convert it to Python, and I would be happy to share it with you if you think it has value to your vision. Thanks for your contribution.

An example: let's say 5 people had a Contacts table, and each had a first-name column and a last-name column. The first-name column you have in your code is First_Name. The first-name columns the 5 clients have are FName, F_Name, FirstName, FN, and "first name". Likewise with the last name, there are 5 different ways to express it. The Master Alias would look for all of these different aliases of each column and ultimately update them to First_Name in the table used by the code. This way our code is never broken by unmapped data, no matter what order the column is in the table. What if First_Name in the import table is at index 5 and not index 0? The code would automatically identify this and place that column of data into the proper index. It's a huge time saver for working with table data - it may still need a bit of cleaning up for nulls, etc., but you would have a strong head start on getting things up and running for someone who is fresh to the Docker image and wants to engage with the content.

Kudos again, and let me know where you are with this. I am also curious about the effort that went into the trading bot series you finished in Oct/Nov - did you scrap it and do something else, or did you build upon it? I am trying to find where the end of this goes and what the results are. I am thinking that you should be feeding the homeless and passing out free dinners to people all over the world by now. That's what I plan on doing, Amen.

P.S. Is having a guitar in the background a YouTube backdrop thing, or do you really play? lol, just curious.

P.S.2: Have you given Proxmox a look? It appears to take the container concept to a whole new level, in my opinion. But I am not a ninja like you on what it could offer and would be interested in what you think about it.
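The core of that header-washer idea translates to a few lines of pandas - a sketch with made-up aliases and file names:

    import pandas as pd

    # Master header alias: every known variant maps to the canonical column name (illustrative)
    HEADER_ALIASES = {
        "FName": "First_Name", "F_Name": "First_Name", "FirstName": "First_Name",
        "FN": "First_Name", "first name": "First_Name",
        "LName": "Last_Name", "L_Name": "Last_Name", "LastName": "Last_Name",
    }

    def wash_headers(df: pd.DataFrame) -> pd.DataFrame:
        """Rename any recognized alias to its canonical name, regardless of column order."""
        return df.rename(columns=HEADER_ALIASES)

    df = wash_headers(pd.read_csv("contacts.csv"))  # hypothetical input file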
Yes, please slim it down. You mentioned trading options with Tradier, so maybe an example - perhaps with Octopy - of, say, in-out credit/debit spreads. Also basic install instructions for those new to Linux. This looks like a very big project, so slimming it down will speed everything up.
22:57 - IB Feedback: From my research, ib-insync is the way to go, I believe. Unfortunately, IB requires you to use a Java applet to authenticate any API-based trading. Headless authentication isn't allowed, so you have to have Trader Workstation or IB Gateway running in the background. The Gateway is the more lightweight option (memory-wise) since it exists simply to authenticate and facilitate API-based trading, whereas TWS is a full desktop app with a GUI so people can click and trade.
The second downside to IB is that the Gateway/TWS will intentionally log you off daily (I think it's daily, but possibly weekly). This is another so-called security mechanism. However, the upside to ib-insync is that it utilises IBC (IB Controller) which automates logging in to TWS/IBGW whenever you're logged out. Right now I'm looking at existing IBGW Docker images that sort this out. I've linked the two that look best to me below.
All this is a bit of a pain but worth it for the low fees, high liquidity, international access, and access to a wide variety of markets. (A minimal connection sketch follows the resource list below.)
Resources:
- ib-insync documentation: ib-insync.readthedocs.io/api.html
- Check out 'Notebooks' and 'Recipes' for some good examples of how simple the code can be.
- User group with lots of helpful info: groups.io/g/insync/
- IBC: github.com/IbcAlpha/IBC
- This has superseded the old 'IBController'.
- IBC is packaged with ib-insync. ib-insync docs on using IBC: ib-insync.readthedocs.io/api.html#ibc
- IB Gateway Docker Containers
- github.com/dvasdekis/ib-gateway-docker-gcp
- github.com/manhinhang/ib-gateway-docker
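As promised above, a minimal ib-insync sketch showing how simple the code can be, along the lines of its docs - the host, port, clientId, and contract are examples, and it assumes TWS/IBGW is already running with API access enabled:

    from ib_insync import IB, Stock

    ib = IB()
    # 7497 is the common paper-trading port; use your own host/port/clientId
    ib.connect("127.0.0.1", 7497, clientId=1)

    contract = Stock("AAPL", "SMART", "USD")
    bars = ib.reqHistoricalData(
        contract, endDateTime="", durationStr="30 D",
        barSizeSetting="1 hour", whatToShow="MIDPOINT", useRTH=True
    )
    print(bars[-1])
    ib.disconnect()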
Thanks for this!
Since you like to build apps, how about a little GUI or CLI application that lets the user build their custom image of tradekit by selecting checkboxes of the tools and libs they'd like to be part of the final image?
Are you going to put it all in one Docker image, or modularize it into various containers and create a docker-compose file? If possible, can you also let us know how this can be deployed to a platform like Jelastic Cloud?
How about showing how to build a Docker image, so we can build our own?
Yes on the stock data schema which would work nicely with a data downloader
Thanks, need to finalize what the schema will look like.
What about security? I'd be careful with using API keys of any brokers (that have access to my real-money accounts) in some Docker image with a bunch of dev tools and libraries. Also, I would not advise using such an image for a deployment into production - too big an attack surface. I hope you will provide a warning when you publish this image on Docker Hub.
Other than that, I have really enjoyed all the videos from you I've watched so far. Thank you!
Apple Silicon users: in the Dockerfile, replace ./configure with ./configure --build=aarch64-unknown-linux-gnu
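For context, a sketch of the kind of TA-Lib build step that flag slots into - the commonly used source-build recipe, not necessarily TradeKit's exact Dockerfile (assumes wget is installed in an earlier layer):

    RUN wget http://prdownloads.sourceforge.net/ta-lib/ta-lib-0.4.0-src.tar.gz && \
        tar -xzf ta-lib-0.4.0-src.tar.gz && \
        cd ta-lib && \
        ./configure --build=aarch64-unknown-linux-gnu --prefix=/usr && \
        make && make install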
You can dump Flask and Jupyter from the image. Are you going to use NGINX to get the webserver on a routable IP address so this will be available from the web?
Awesome! Could you add H2O ML to the libraries?
Larry, this looks awesome! The more plug and play you make it the better, so that everyone can focus on strategies. Maybe keep a slimmed down version separate and base it on feedback plus your opinion. What is the installed size of this?
Agreed, I have realized the ML libraries are quite large and most people probably won't even use them. Will see about splitting this up. Thanks for the feedback.
Wow man. God level prep.
I love this idea. Reminds me of QuantRocket
it is perfect thanks a lot..
The requirements file does not contain the versions of the libraries - will that cause potential issues in the future?
Hello, hopefully it was clear I am collecting feedback at the moment. Will be adding some packages, freezing versions, and writing some docs. Will be updating requirements.txt
You are the king!
Do you know how to embed a TradingView chart in Dash? And add lines?
This is freakin awesome!!
Hey Larry, you could include ccxt.
Larry, after running Docker, my TimescaleDB install has an error. Is anything missing?
Love you for these nice pieces... It's a shame that I didn't find TradeKit before I tried to install TA-Lib :/
Please make a video on a price-action-based Python algo (higher highs/higher lows and lower highs/lower lows, trendlines, etc.).
Do I need big storage to install it? Thanks
Hi there, would you be open to adding Coinbase pro as one of the brokers? I am not a fan of the fees that Binance charges for withdrawals.
Epic App! Great Work...!
Windows users should be using WSL2, not the native Windows environment, for Node.js or Python development. This gives Windows users a Linux environment specially integrated with the Windows OS. Even the official Microsoft docs recommend this configuration. Yes, you can use only the native Windows environment, but many times you'll hit compatibility issues with packages. This Docker solution is really good too, though. VS Code can run remotely against WSL2, Docker, or SSH.
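For anyone setting that up, the short version - the install command is the documented one, the paths are illustrative:

    # From an elevated PowerShell prompt: install WSL2 with an Ubuntu distro
    wsl --install -d Ubuntu

    # Then, inside the WSL shell, open the current project in VS Code
    # (requires the Remote - WSL extension; VS Code connects into the Linux environment)
    cd ~/projects/tradekit
    code .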
Thanks for the feedback. Still re-familiarizing myself with the latest in Windows since I haven't used it as my primary OS in a while. Docker required me to set up the Linux subsystem, but it's been a while since I have used it. I should use my Surface Book more :).
@@parttimelarry Hi Larry. Yes, I'm getting an error at step 10/10 when it starts collecting and downloading scipy. Some sort of exception error. Any ideas how I can address this?
Nice job. Are you still open to contributions? I can add in some backtrader strategies.
I'm not sure if machine learning packages inside the container would make sense. I think accessing a GPU inside a Docker container is a tough job.
PRETTY SWEET
Nice Docker image... but only the first example link works for me; the rest give this error on clicking the link: {"detail":"Not Found"}
Well thought out, good preparation, great execution.
As u say, prob too many libs now? But it really depends on the purpose: if next is a "project", maybe fewer; if next is "loose examples", then more is OK for now? But I guess, as the old saying goes, at the end of the day... less is more!
Imho a couple of locks on some technology is a must: on postgres, on debian, sqlite, redis, talib, etc.
If u ever consider a UI on mobile too, imho take a peek at Flutter.
PS: I like using pandas-ta, but it really is "just" another flavor of talib.
Congratz, Larry
Thanks for the feedback here, agree with your points on locking the versions and possibly being a bit more selective and having different requirements.txt files for different "flavors". From the feedback I received so far, including all of the ML stuff slows things down quite a bit since those packages are huge. Including some lightweight indicator and backtesting libraries doesn't add much bloat though. Also could be good to just have a different config for each broker.
Great tutorial! Thank you for all you do. I have a question for you or the community, though, as I am a bit stuck. I was able to get Debian running in Docker, but when I enter psql -U postgres, I get this error: "psql: error: FATAL: role 'postgres' does not exist". I made sure postgres was installed and that postgres was the default user, but I continue to get the same error. Anyone have any ideas as to what I might be doing wrong?
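One common cause, offered as a guess rather than a definitive fix: on Debian the server runs under the postgres system user and uses peer authentication by default, so psql has to be invoked as that user (or the missing role created first):

    # Run psql as the postgres system user (typical on Debian)
    su - postgres -c psql

    # Or create a role for your current user so plain psql works afterwards
    su - postgres -c "createuser --superuser $(whoami)"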
You are my personal Jesus. Thanks a lot!!!!
@parttimelarry I hope it is not too much trouble for you to update the TradeKit collection - it has been almost 2 years since it was released
Many Thanks in advance :)
Really awesome.
I really get the idea.
But profits matter; if we get profits, that would be blessed.
How about links to relevant articles, educational pages, etc. that you may reference in your videos?
Agreed, I think it would be cool to build out the landing page UI and the Github README and have it not only provide a set of interactive examples that can be edited and run, but also linked up to additional resources and documentation. And then over time it becomes a really cool starter kit for learning and developing new things collaboratively.
@@parttimelarry You've mentioned creating a course, so maybe build out a course for the Docker content, contained in a folder of HTML pages in the Docker image - then everything is "packaged to go." I would gladly pay for something like that.
Hi Larry, your channel and videos are great, it's awesome. Can I rent you for a couple of weeks in order to have my strategy running in the cloud?
I am not for rent, I am too expensive :)
Have you thought about adding Broker API Support for TradeStation10?
I haven't thought about this yet, mostly because I'm not so familiar with TradeStation (although I have read a book that uses EasyLanguage). Doesn't TradeStation mostly assume you are going all-in on their platform rather than creating your own tools/infrastructure? I don't really know the platform yet.
@@parttimelarry Well, I haven't really dug very deep to be honest. But, I know they are integrated with TradingView (which I imagine is by API) and there is an API available => www.tradestation.com/platforms-and-tools/web-api/
@@mattsterba Awesome, thanks for the link. I have been gradually learning about what platforms everyone uses to trade, and TradeStation does come up a lot, so I need to learn more about it. Hopefully I can provide some value here as well!
Macallan!! Love it, have the same bottle at home 😁
Haha Cheers and Happy New Year! Drinking it right now :)
nice
Selenium (headless) could be useful. I use Selenium to log on to websites and pull data. Or is there a better option?
Good idea! I think this will be handy for some API's as well.
@@parttimelarry There is even a more lightweight package for this called pyppeteer.
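For reference, headless Selenium is only a few lines these days, and the pyppeteer mentioned above has a similarly small async API. A minimal sketch (the URL is a placeholder):
# headless_sketch.py — headless Chrome via Selenium; URL is a placeholder
from selenium import webdriver

options = webdriver.ChromeOptions()
options.add_argument('--headless')
driver = webdriver.Chrome(options=options)
try:
    driver.get('https://example.com/login')
    html = driver.page_source  # scrape or parse from here
finally:
    driver.quit()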
Nice
top man
Hi friend, the mic isn't the best. There is feedback :(
Do you (or anyone) know how (or if it's even possible) I could do something like the following decorator I created, which wraps asyncpg DB fetches with a connection from the pool and a transaction block? I'm working on a DB helper class and don't want to keep writing "with" statements for every function. I'd like to end up with an abstract function that accepts "sqlStatement" as an argument, with that function decorated with the "@_connectionTransaction" decorator I've created, to abstract away the pool/transaction and keep things DRY. Any help would be much appreciated.
Full helper class if it's any use in finding an answer (or helping others)...
import sys, asyncpg
from pandas import DataFrame
from config import DB_LOGGING_REF
from .definitionsAndQueries import queryStockSymbolsAndIds, dataFrameColumnsStockSymbolsAndIds

class AsyncTimescaleDB:
    """Asynchronous TimescaleDB (PostgreSQL extension) class"""

    def __init__(self, DB_HOST, DB_USERNAME, DB_PASSWORD, DB_PORT, DB_NAME, LOG):
        self._host = DB_HOST
        self._username = DB_USERNAME
        self._password = DB_PASSWORD
        self._port = DB_PORT
        self._dbName = DB_NAME
        self._LOG = LOG
        self._connectionPool = None

    async def _init_async(self):
        """Asynchronous function initializations for this class"""
        await self._createConnectionPool()

    async def _createConnectionPool(self):
        """Creates a pool of database connections for asynchronous usage"""
        if self._connectionPool is None:
            try:
                self._connectionPool = await asyncpg.create_pool(
                    host=self._host,
                    port=self._port,
                    user=self._username,
                    password=self._password,
                    database=self._dbName,
                    command_timeout=120
                )
            except Exception as error:
                self._LOG.fatal('Problem creating database connection pool: ', error)
                sys.exit(1)
            self._LOG.info(f"Successfully created database connection pool: {DB_LOGGING_REF}")

    def _connectionTransaction(self, func):
        """Retrieves a database connection from the pool and establishes a transaction block"""
        try:
            async def _connectionTransactionWrapper(self, *args, **kwargs):
                async with self._connectionPool.acquire() as _connection:
                    async with _connection.transaction():
                        # Database interaction function that we wrap with a connection and transaction
                        return await func(_connection, *args, **kwargs)
            return _connectionTransactionWrapper
        except Exception:
            self._LOG.error('Problem with connection pool or transaction layer', exc_info=True)

    @_connectionTransaction
    async def stockSymbolsAndIds(self, _connection):
        """Fetches stock symbols and IDs (linking to historical data table) from database"""
        try:
            resultsRecords = await _connection.fetch(queryStockSymbolsAndIds)
            return DataFrame(resultsRecords, columns=dataFrameColumnsStockSymbolsAndIds)
        except Exception:
            self._LOG.error('Problem fetching stock symbols and IDs: ', exc_info=True)
Figured it out:
class AsyncTimescaleDB:
    """Asynchronous TimescaleDB (PostgreSQL extension) class"""

    def __init__(self, DB_HOST, DB_USERNAME, DB_PASSWORD, DB_PORT, DB_NAME, LOG):
        self._host = DB_HOST
        self._username = DB_USERNAME
        self._password = DB_PASSWORD
        self._port = DB_PORT
        self._dbName = DB_NAME
        self._LOG = LOG
        self._connectionPool = None

    async def _initAsync(self):
        """Asynchronous function initializations for this class"""
        await self._createConnectionPool()

    async def _createConnectionPool(self):
        """Creates a pool of database connections for asynchronous usage"""
        if self._connectionPool is None:
            try:
                self._connectionPool = await asyncpg.create_pool(
                    host=self._host,
                    port=self._port,
                    user=self._username,
                    password=self._password,
                    database=self._dbName,
                    command_timeout=300
                )
            except Exception as error:
                self._LOG.fatal('Problem creating database connection pool: ', error)
                sys.exit(1)
            self._LOG.info(f"Successfully created database connection pool: {DB_LOGGING_REF}")

    def _connectionTransaction(functionBeingWrapped):
        """Wraps function with pool connection and transaction block"""
        async def _connectionTransactionWrapper(self, *args):
            try:
                async with self._connectionPool.acquire() as _connection:
                    async with _connection.transaction():
                        # Database interaction function that we wrap with a connection and transaction
                        return await functionBeingWrapped(self, _connection, *args)
            except Exception:
                self._LOG.error('Problem with connection pool or transaction: ', exc_info=True)
        return _connectionTransactionWrapper

    async def _abstractFetchAsDataFrame(self, _connection, sqlStatement, dataFrameColumns):
        """Fetches values from database based on the SQL statement argument"""
        try:
            resultsRecords = await _connection.fetch(sqlStatement)
            # Convert results to dataframe
            return DataFrame(resultsRecords, columns=dataFrameColumns)
        except Exception:
            self._LOG.error('Problem fetching values: ', exc_info=True)

    @_connectionTransaction
    async def stockSymbolsAndIds(self, _connection):
        """Fetches stock symbols and IDs (linking to historical data table) from database"""
        return await self._abstractFetchAsDataFrame(
            _connection, queryStockSymbolsAndIds, dataFrameColumnsStockSymbolsAndIds
        )
Thanks for your posts here, still need to catch up :). Will try out some of your techniques soon, since I am trying out TimescaleDB this week and your async code will come in handy.
@@parttimelarry No worries! Don't forget the class will need instantiating differently due to the async method that needs to happen at the initialisation stage. I've linked an example below. Also, you should consider shifting from requests to aiohttp; they work very similarly, but you won't get the blocking that you do with requests.
# main.py
# -----------------------------------
import yaml, aiohttp, asyncio, logging.config
from database.AsyncDatabase import AsyncDatabase
from client.AsyncHTTP import AsyncHTTP
from database.StockData import StockData
from config import APP_NAME, LOG_CONFIG_PATH
from config import DB_HOST, DB_USERNAME, DB_PASSWORD, DB_PORT, DB_NAME

async def main():
    asyncDatabase, LOG = await _init()
    async with aiohttp.ClientSession() as clientSession:
        # Initialize asynchronous HTTP client and data classes
        asyncHTTP = AsyncHTTP(clientSession, LOG)
        stockData = StockData(asyncDatabase, asyncHTTP, 24, '1min', LOG)
        # Obtain all data we are interested in
        await stockData.populateStocks()
        # await stockData.populateHistoricalStockPriceData()
    await _destroy(asyncDatabase, LOG)

async def _init():
    # Parse the logging configuration file (run once here, application singleton)
    with open(LOG_CONFIG_PATH) as f:
        logConfig = yaml.load(f, Loader=yaml.FullLoader)
    # Load the parsed config
    logging.config.dictConfig(logConfig)
    LOG = logging.getLogger("app")
    LOG.info(f"{APP_NAME}: Application started")
    # Initialise asyncDatabase
    asyncDatabase = await _createAsyncDatabase(LOG)
    return asyncDatabase, LOG

async def _createAsyncDatabase(LOG):
    asyncDatabase = AsyncDatabase(DB_HOST, DB_USERNAME, DB_PASSWORD, DB_PORT, DB_NAME, LOG)
    # Asynchronous class function initialization
    await asyncDatabase._initAsync()
    return asyncDatabase

async def _destroy(asyncDatabase, LOG):
    await asyncDatabase._destroy()
    LOG.info(f"{APP_NAME}: Application exited successfully")

# Application entry point
if __name__ == "__main__":
    asyncio.get_event_loop().run_until_complete(main())
# AsyncDatabase.py
# ------------------------------------------
import asyncpg
from pandas import DataFrame
from config import DB_LOGGING_REF
from .definitionsAndQueries import (
    queryCreateStocksTable, queryCreateHistoricalStockPricesAndIndicatorsTable,
    queryStockSymbolsAndIds, dataFrameColumnsStockSymbolsAndIds,
    queryCreateTimeScaleDbExtentsion, queryConvertHistoricalToHyperTable,
    queryInsertStocks
)

class AsyncDatabase:
    """Asynchronous TimescaleDB (PostgreSQL extension) class"""

    def __init__(self, DB_HOST, DB_USERNAME, DB_PASSWORD, DB_PORT, DB_NAME, LOG):
        self._host = DB_HOST
        self._username = DB_USERNAME
        self._password = DB_PASSWORD
        self._port = DB_PORT
        self._dbName = DB_NAME
        self.LOG = LOG
        self._connectionPool = None

    async def _initAsync(self):
        """Asynchronous function initializations for this class"""
        await self._createConnectionPool()
        await self._createAllTablesIfNotExist()

    async def _createConnectionPool(self):
        """Creates a pool of database connections for asynchronous usage"""
        if self._connectionPool is None:
            try:
                self._connectionPool = await asyncpg.create_pool(
                    host=self._host,
                    port=self._port,
                    user=self._username,
                    password=self._password,
                    database=self._dbName,
                    command_timeout=300
                )
            except Exception:
                self.LOG.exception("Failed in creating database connection pool")
            else:
                self.LOG.info(f"Successfully created database connection pool: {DB_LOGGING_REF}")

    async def _destroy(self):
        """Class destruction process"""
        if self._connectionPool is not None:
            await self._connectionPool.close()

    def _connectionTransaction(functionBeingWrapped):
        """Wraps function with pool connection and transaction block"""
        async def _connectionTransactionWrapper(self, *args):
            try:
                async with self._connectionPool.acquire() as _connection:
                    async with _connection.transaction():
                        # Database interaction function that we wrap with a connection and transaction
                        return await functionBeingWrapped(self, _connection, *args)
            except Exception:
                self.LOG.exception('Connection pool or transaction error: ')
        return _connectionTransactionWrapper

    async def _abstractExecute(self, _connection, SQL):
        """Executes an SQL command (or commands)"""
        try:
            await _connection.execute(SQL)
        except Exception:
            self.LOG.exception(f"{SQL}\n")

    async def _abstractFetchAsDataFrame(self, _connection, SQL, dataFrameColumns):
        """Fetches values from database based on the SQL statement argument"""
        try:
            resultsRecords = await _connection.fetch(SQL)
        except Exception:
            self.LOG.exception(f"\"{SQL}\"\n")
        else:
            # Convert results from records to dataframe
            return DataFrame(resultsRecords, columns=dataFrameColumns)

    @staticmethod
    def _SQLStringConcatenator(stringList):
        """Combines multiple SQL strings to form one execute statement with ';' statement terminators"""
        return ";\n".join(stringList) + ";"

    @_connectionTransaction
    async def _createAllTablesIfNotExist(self, _connection):
        """Creates all required tables within the database (if they do not already exist)"""
        try:
            executeString = self._SQLStringConcatenator([
                queryCreateStocksTable, queryCreateHistoricalStockPricesAndIndicatorsTable,
                queryCreateTimeScaleDbExtentsion, queryConvertHistoricalToHyperTable
            ])
            await self._abstractExecute(_connection, executeString)
        except Exception:
            self.LOG.exception("Failed in creating all database tables")
        else:
            self.LOG.info("Successfully created all database tables")

    @_connectionTransaction
    async def getStockSymbolsAndIds(self, _connection):
        """Fetches stock symbols and IDs (linking to historical data table) from database"""
        return await self._abstractFetchAsDataFrame(
            _connection, queryStockSymbolsAndIds, dataFrameColumnsStockSymbolsAndIds
        )
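On the requests-to-aiohttp point above, the swap is mostly mechanical. A hypothetical sketch of what the AsyncHTTP helper imported in main.py might look like (the name mirrors the snippet above, but this is an assumption, not the poster's actual class):
# AsyncHTTP.py — hypothetical sketch of the helper referenced in main.py
import aiohttp

class AsyncHTTP:
    """Thin wrapper around a shared aiohttp.ClientSession."""

    def __init__(self, clientSession: aiohttp.ClientSession, LOG):
        self._session = clientSession
        self._LOG = LOG

    async def getJSON(self, url, params=None):
        """Non-blocking GET returning parsed JSON (the awaitable analog of requests.get(url).json())."""
        try:
            async with self._session.get(url, params=params) as response:
                response.raise_for_status()
                return await response.json()
        except aiohttp.ClientError:
            self._LOG.exception(f"GET failed: {url}")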
Hey man, can I contact you somewhere?
I've got an 800-line TradingView Pine Script that I want to use to trade on the calls it makes via webhook URLs, but I don't know anything about coding. I watched your videos and they didn't really get me anywhere, so I don't know what to change in the script to make it work.
Want to help me out for 50 bucks or something?