Streaming response continuously? #58
Comments
We've talked about adding streaming API support to both http/1.1 and http/2 pools. It just hasn't been a priority yet. We're not against the idea though. Is the goal to open a connection to a URL and continually stream responses to the calling process? Or do you need something more advanced than that?
In my case it would be exactly the first scenario - continuously receiving an unbounded stream of updates.
What do you have in mind, Chris, when you mention something more advanced? Just curious about your ideas. I want to add something, sticking with the example of brokers for forex or crypto exchanges. Some send market price updates by continuously streaming HTTP responses on a connection, as described in this issue; but other brokers send those price updates over the WebSocket protocol. Would that also be a job for an HTTP client such as Finch? I think so; but if not, why not, and what tool would I need to receive those price updates over a WebSocket?
@thojanssens If you need bi-directional communication that is guaranteed to be on the same connection, you can't do that with http/1.1. It's only possible on http/2. Along with that limitation, our current pooling/load-balancing strategy doesn't give you control over which connection your request uses. For instance, if you start 4 http/2 connections and begin streaming a response on one of them, a new request can end up on any of the 4 connections.

I don't think the use case I just described is something we're going to attempt to solve in Finch. If you need that much control, you're better off using Mint directly. But I think Finch can provide mechanisms for streaming responses. We'll need to provide a caveat for http/1.1 pools: if you start streaming responses with limited pool sizes, you may end up exhausting your pool. But anyone who needs streaming in that way is most likely aware of the tradeoff.
Streaming support has been merged with #59. If y'all would like to try it out, that would be great. I'm going to close this issue for now. If specific issues come up, please let us know.
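For anyone landing here, the streaming API merged in #59 can be consumed roughly like the sketch below. This is illustrative, not from the thread: the pool name `MyFinch` and the URL are placeholder assumptions.

```elixir
# Start a Finch pool (MyFinch is an arbitrary name chosen for this sketch).
{:ok, _} = Finch.start_link(name: MyFinch)

# Finch.stream/5 invokes the callback as each part of the response
# arrives, so an unbounded body can be processed chunk by chunk
# without ever buffering the whole response.
Finch.build(:get, "https://broker.example.com/prices/stream")
|> Finch.stream(MyFinch, nil, fn
  {:status, status}, acc ->
    IO.puts("status: #{status}")
    acc

  {:headers, _headers}, acc ->
    acc

  {:data, chunk}, acc ->
    # Each chunk is handed over as soon as it is read off the socket.
    IO.inspect(chunk, label: "chunk")
    acc
end)
```

Note the caveat from the discussion above: on an http/1.1 pool, a long-lived streaming request holds its connection for the duration, so a small pool can be exhausted by a few concurrent streams.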
Imagine connecting to a forex/crypto trader that streams the response continuously with real-time price changes.
I can achieve that easily with Mint through its API (see sample code below); we can stream and read from the response forever. But it seems that this is not possible with Finch?
I stumbled upon the same feature request for Tesla: elixir-tesla/tesla#271, which has been open since 2018.
The problem with Mint is that it doesn't provide connection pooling (there's mint_pool, which was in development but is stopped for now), and using something higher-level such as Finch would be easier to work with.
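The kind of Mint usage described above can be sketched as follows. This is not the original sample code (which was elided); the host, path, and module name are placeholder assumptions.

```elixir
defmodule PriceStream do
  # Connect, issue one GET, then read the response body forever.
  def run do
    {:ok, conn} = Mint.HTTP.connect(:https, "broker.example.com", 443)
    {:ok, conn, ref} = Mint.HTTP.request(conn, "GET", "/prices/stream", [], nil)
    loop(conn, ref)
  end

  # Mint is process-less: raw socket messages land in this process's
  # mailbox, and Mint.HTTP.stream/2 decodes them into response parts,
  # so we can keep receiving an unbounded stream of chunks.
  defp loop(conn, ref) do
    receive do
      message ->
        case Mint.HTTP.stream(conn, message) do
          :unknown ->
            # Not a message for this connection; keep waiting.
            loop(conn, ref)

          {:ok, conn, responses} ->
            for {:data, ^ref, chunk} <- responses do
              IO.inspect(chunk, label: "tick")
            end

            loop(conn, ref)

          {:error, _conn, reason, _responses} ->
            {:error, reason}
        end
    end
  end
end
```

The tradeoff is exactly the one raised here: Mint gives full control over the single connection, but pooling, supervision, and load balancing are left to the caller.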