💡 Get my FREE 7-step guide to help you consistently design great software: arjancodes.com/designguide.
Standards and Easy integration are probably the most important. For example, if an API is going to accept a standardized data format (like JSON), FOLLOW THE SPEC!! (Authorize Net, I'm looking at you!!)
Hey, what do you mean by "spec" in this context? Do you mean JSON spec so serialising/de-serialising is bug free?
This video came at just the right moment. I'm preparing some APIs for a system, and these tips will be implemented immediately. Thanks Arjan.
You’re welcome!
I'm looking forward to seeing more videos about design, like general concepts. I taught myself programming, and after a short period of time I realize that I'm developing a Frankenstein that's gonna eat me in a while. I build a mess 😂
More to come!
Throughout the years I have found that the key to any pattern/technique/solution/method/etc. is using it in the right context. That is, use common sense; that is what people lack (including coders and programmers!)
You don't have to use a bulldozer to dig a 1-meter-deep hole; a shovel will work like a charm. But if you expect to dig a foundation with a shovel, it will take forever! Use the right tool for the right context!
APIRoast 🔥
You’re a legend! Pointing out which APIs are great vs. sht is so helpful
Glad you find it helpful!
@ArjanCodes Thanks for the video. I have two questions: 1. What do you think about Swagger for API documentation?
2. What do you think about using POST instead of GET when there are multiple search filters?
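To illustrate question 2, here is a rough sketch with FastAPI (the endpoint and field names are made up; it assumes Pydantic v2 and Python 3.10+). It also touches on question 1, since FastAPI generates an interactive Swagger UI at /docs from the same code:

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()  # interactive Swagger UI is served at /docs automatically

class TransactionSearch(BaseModel):
    start_date: str | None = None
    end_date: str | None = None
    status: str | None = None
    min_amount: float | None = None

@app.post("/transactions/search")
def search_transactions(filters: TransactionSearch):
    # With many optional filters, a POST body stays readable where a long
    # GET query string would become unwieldy.
    return {"filters": filters.model_dump()}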
We need a video on how to develop an SDK, please.
Thanks! This was great coverage of the subject.
Hi Robert, thank you so much, glad you enjoyed it!
The date/time standards are complicated, but they are handled by Python's standard library plus additional packages (probably the same for JS). There are a lot of transactions, so requiring second-level precision is probably there for performance reasons.
Anyone watching this video, ignore all of Arjan's comments on PayPal's use of the ISO-8601 datetime format. That is a *financial* and *developer-centric* API, so it *requires* arguments to be *as accurate as possible*. The use of the ISO format:
- Guarantees an international standard
- Guarantees easy datetime parsing and construction in almost all modern programming languages (python is one of the few oddballs that didn't follow international standards on datetime strings for some daft reason)
- Guarantees cross time zone accuracy
While Arjan is correct in saying arguments in APIs should be simple and easy to use, this is one of the worst examples to give! Datetimes in financial transactions are just one of those things where accuracy is a pivotal Functional Requirement.
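On the easy-parsing point, a minimal sketch with Python's standard library (note that datetime.fromisoformat only accepts a trailing "Z" from Python 3.11 onwards):

from datetime import datetime, timezone

# Parse an ISO-8601 timestamp that carries an explicit UTC offset.
start = datetime.fromisoformat("2024-07-21T05:30:00+00:00")

# Produce one: attach a timezone before serializing so the offset is explicit.
end = datetime.now(timezone.utc)
print(start.isoformat())  # 2024-07-21T05:30:00+00:00
print(end.isoformat())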
I agree with you that date and time arguments should be accurate. I also don’t advocate for not using a standard for dates/times. The point I wanted to bring across in the video is that a) an API should have sensible defaults such as using the current date/time for the end time when retrieving transactions, and b) that API documentation should give clear examples of how to use the endpoints. Since dealing with dates and times can be complex, this is even more important for endpoints that use them.
That's understandable! However, from experience developing within banks, there are no such things as "sensible defaults" in financial transactions, especially in APIs designed for hundreds or thousands of payments per minute :)
You see similar in APIs for "safety critical" processes. While it is more "intuitive" to do things like round to the nearest minute/hour, there can be a lot on the line for mistakes that can happen when automated processes multiply micro-errors into macro-errors!
Thanks for sharing! I don’t have experience working within a bank, so that’s certainly a different ballgame. Defaults and good documentation/examples are still helpful though if you’re a small business and you’re looking to integrate with a payment provider without having to get a degree in finance!
THANK you. I was hoping someone else would say it.
I agree. There can't be a sensible default for datetime ranges if you work with multiple timezones. If a client requests the transactions for "today", you need to specify which timezone's day you're talking about. If your server is on UTC but your customer is at UTC-6, "today" can be a different calendar day for each of them.
So a transaction that happened at 5:30 UTC on July 21 happened, for clients at UTC-6, on July 20 at 23:30, which is a different calendar day to them. That's why those APIs require setting start/end dates with ISO-8601, which accounts for timezones.
If the server defaulted "today" to 00:00-23:59 UTC and the customer is not using UTC, they would see records from other calendar days.
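A quick sketch of that boundary issue in Python (the timestamp is just the example from above):

from datetime import datetime, timezone, timedelta

utc_txn = datetime(2024, 7, 21, 5, 30, tzinfo=timezone.utc)  # 05:30 UTC on July 21
local_txn = utc_txn.astimezone(timezone(timedelta(hours=-6)))  # same instant at UTC-6

print(utc_txn.date())    # 2024-07-21
print(local_txn.date())  # 2024-07-20, a different calendar day for the client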
Thanks for the video. I enjoyed it!
Glad you enjoyed it!
It would be interesting if you created a video or course about structuring a FastAPI application. E.g., how would you structure the endpoints (model-focused? domain-focused?), and how would you handle business logic?
🔥 These 6 tips are GOLD for anyone building REST APIs! 💻 I’ve tried a few and they’ve drastically improved my API design. Can't wait to apply the rest! 😍 If you're into backend development or just starting out, this video is a MUST-WATCH! 🚀 #API #Backend #RESTAPI
You read my mind. I had been thinking about finding resources on what a good API should be
Quickly becoming the best coding channel. $10 for pro is too little; I think it's worth more than that. You should add a supporter tier for more than $10. I would get it in a heartbeat, the value is already there.
Thanks, Arjan!
re: custom data, off the top of my head, similar to the video's approach with some differences:
Use the Postgres JSON data type to store actual JSON (the custom data dictionary). That's already one step better than string storage with parsing.
Then you can choose whether to do the mutation logic in Postgres itself with custom functions, or do that in the Python layer. Either way, you need something that can merge JSON without dumb overwrite behavior, and avoid putting that chore on the API user's back. (As you mentioned.)
Probably the easier/cleaner path at first is to do all the mutation stuff in Python and use Postgres purely for storage with vanilla IO. Hopefully you can leverage Python syntax to make this pretty breezy, i.e.
custom_data = getdata(record_id)  # getdata/savedata are hypothetical helpers
new_custom_data = req_body["newdata"]
custom_data.update(new_custom_data)  # dict.update merges in place (and returns None)
savedata(record_id, custom_data)  # write the merged dict back
I see a lot of sites providing a panel with examples in various languages (cURL, Python, JS, ...) as input and the corresponding result for every endpoint.
Is there a way to generate these panels automatically?
Always looking forward to Friday 17 CET
Glad to be of service 😊
Same same.
Very insightful video!
Glad you found it helpful!
What about making a test framework / sandbox for unit testing your API?
Hi Arjan, your videos are as always precise and insightful.
I have a question: I am currently doing a project that requires, at the end of the execution, pushing 20-25K rows of data to an MSSQL server (on-premise). I used SQLAlchemy, which was a bit slow; later I used pyodbc, which was a bit faster. Now it takes approximately 3 minutes to push 4K rows. My data has 20 columns, including int, varchar, and bit types.
Would you suggest something that could make the push faster, maybe inserting chunks of 100-150 rows at a time?
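Not sure what your current insert code looks like, but one thing that often helps with pyodbc against SQL Server is fast_executemany combined with batched executemany calls. A rough sketch, where the connection string, table, columns, and the rows list are all placeholders:

import pyodbc

rows = [("a", 1, True), ("b", 2, False)]  # replace with your 20-25K tuples, in column order

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;DATABASE=mydb;UID=user;PWD=secret")
cursor = conn.cursor()
cursor.fast_executemany = True  # send parameters in bulk instead of row by row

sql = "INSERT INTO my_table (col_a, col_b, col_c) VALUES (?, ?, ?)"
batch_size = 1000
for i in range(0, len(rows), batch_size):
    cursor.executemany(sql, rows[i:i + batch_size])
conn.commit()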
Excellent tips!
👏👏👏👏👏
Glad you liked them, Rafael!
10:21 I can reassure you that being in the field doesn't help with understanding what this means...
Because I can see fields related to corporate bonds, but also to foreign exchange. So the API is mixing asset classes even though the name suggests it should only contain currencies. Complete nightmare to work with!
YUP
I can see why PayPal API would be intentionally designed to force use of fully granular arguments re: datetime (including seconds and timezone). It's an interesting pivot point here between "keep it simple" and "use obvious default behavior" vs safeguard the user against dumb logic errors especially where there are possible gotchas. (How you handle datetimes is a classic source of headache.)
But this actually serves your broader point -- it should ALWAYS be as simple as possible. There should be a good reason why it's not as simple as you might think at first. And that reasoning should be explained well, and made obvious through example code. AKA "as simple as possible" might not mean "as easy as the novice thinks of it", and the required complexity then needs to be surfaced directly.
I know private methods aren't really a Python thing, but I would have used the _ naming convention for the methods I'll use only inside the class.
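Something along these lines (the class and method names are just for illustration):

class TransactionClient:
    def get_transactions(self, start, end):
        # public entry point of the class
        return self._parse_response(self._request(start, end))

    def _request(self, start, end):
        # leading underscore signals "internal use only" by convention
        ...

    def _parse_response(self, response):
        ...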
Hmmm... I didn't get the part about adding IDs of related objects. Isn't it obvious that you should provide only the needed data (meaning related objects will be useful only for chosen use cases)? And when it's needed, you can always return not only IDs but also an aggregate (more detailed information about related objects). Input and output serializers (or input and output DTOs) are also the next big thing you missed. I liked that you showed real API examples.
Apparently there is a `sqlalchemy-json` project that adds mutation-tracked JSON types.
Good to know, thanks for sharing that!
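For top-level keys, plain SQLAlchemy has a built-in variant of this as well; a minimal sketch with MutableDict (the model and table names are just examples). As far as I understand, `sqlalchemy-json` additionally tracks nested changes:

from sqlalchemy import JSON, Column, Integer
from sqlalchemy.ext.mutable import MutableDict
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Item(Base):
    __tablename__ = "items"
    id = Column(Integer, primary_key=True)
    # MutableDict tracks in-place changes to top-level keys, so
    # item.custom_data["color"] = "red" marks the row as dirty.
    custom_data = Column(MutableDict.as_mutable(JSON), default=dict)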
While that's all nice and dandy, I have a feeling that some basic CRUD should be an "out-of-the-box" experience... like plug in your custom functions here and there and get an API (is there a package like that? gotta be one somewhere...), at least for CRUD on objects that the customer "owns".
The fact that you write applications that don't care about proper representation of time is just an admission that you don't actually work in the real world. Using anything other than a proper standard representation of time in an API that is used globally would be crazy. Sometimes things are not as simple as they are with your hobby projects.
You’re missing the point. I don’t advocate for not using a standard for dates/times. The point I wanted to bring across in the video is that a) an API should have sensible defaults such as using the current date/time for the end time when retrieving transactions, and b) that API documentation should give clear examples of how to use the endpoints. Since dealing with dates and times can be complex, this is even more important for endpoints that use them.
I never actually succeeded in doing anything in SQLAlchemy from their documentation 😅
Haha, I totally get that.
SQLAlchemy has atrocious documentation; I tried really hard and could barely make any sense of it.
I think you're using "mea culpa" wrong, Arjan. Look it up.
The video is good info though, even if you don't write Python.
Stupid question: what is the difference between documentation and an SDK?
An SDK is typically a library whereas documentation is often a website. For example, Stripe has a Python SDK, this is a package you install. It also has a website with documentation.
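To make the distinction concrete, a tiny sketch of the SDK side (written from memory, so treat the exact call names as an assumption; the API key is a placeholder):

# With the SDK: pip install stripe, then talk to the API through Python objects.
import stripe

stripe.api_key = "sk_test_..."  # placeholder test key
customer = stripe.Customer.create(email="jane@example.com")

# Without an SDK you would read the documentation website and build the raw
# HTTP requests to https://api.stripe.com yourself, e.g. with the requests library.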
This was just an excuse to bash the PayPal API, wasn't it🤣
LOL
PayPal documents that it has a delay of up to three hours between something being entered and appearing in the search results. You don’t mention the other handlers. Do they explicitly say the response is instant (or has some shorter period)? Or are you assuming that their lack of statement means that it is immediate? Have you tested them? Are you truly this naïve?
Great video! But clearly you didn't design your API to handle a couple of trillion concurrent requests! What is this?!? Amateur hour?
Haha, proud to be an amateur!
I like your channel, the content, and your presentation and talking. It's a lovely channel. But the thumbnails you see before you click the video are terrible. Really, really bad. Please improve that, a little bit more classy ;)
not a great thumbnail
not a great comment
Thanks Arjan!