  • by giantg2 on 7/13/2021, 12:19:50 PM

    I think it depends on the details and performance requirements. Otherwise, the general sentiment of 'use any of the major technologies you're familiar with' stands.

    For example, is this software aimed at the end user, or will the startup use it themselves? If the end user will use it (like alerts for their portfolio/watch list), then the definition of "real-time" is going to be lax because you need human interaction. If it's used by an institution for trading, then you will need to choose a highly performant language, architecture, and hardware so that your algorithm can place its orders before other institutions. You would also likely need to integrate with SWIFT (the financial messaging network).

  • by heeton on 7/13/2021, 8:05:21 AM

    Any major tech stack will work fine. I assume “real-time alerts” will need an integration with a 3rd-party notification provider (email, SMS, etc.), and the bulk of your business logic will be built around whoever is providing your share price API.

    So, tech doesn’t really matter, pick something simple and mainstream so you can get help as needed.

    Ruby/Rails, Python/Django, Nodejs, Java, whatever. Postgres for a database.

    You likely won’t need any exotic cool tech, novel databases etc.

  • by lalo2302 on 7/13/2021, 11:42:01 AM

    Take into consideration not to use floats for money. Use a `decimal` data format. I'd also advise against using JavaScript since it has problems dealing with big numbers. There are libraries to handle that, of course.

    If you do use JS, make sure to google how to do money calculations accurately.
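
    To make the float issue concrete, here's a tiny Python sketch (made-up numbers, purely illustrative):

        # Floats accumulate binary rounding error; Decimal does not.
        from decimal import Decimal, ROUND_HALF_UP

        print(0.1 + 0.2)              # 0.30000000000000004

        price = Decimal("19.99")      # construct Decimals from strings, not floats
        qty = Decimal("3")
        total = (price * qty).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
        print(total)                  # 59.97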

  • by krishvs on 7/13/2021, 12:14:02 PM

    JVM or Python, just for the ecosystem: Kafka, Debezium, workflow engines (Camunda), ORMs, etc. Java libraries might not be as nice to use as Ruby gems, but some are really rock solid for your use case.

    That being said, anything really - Rails, Node, etc. would work just fine for your use case.

  • by ecesena on 7/13/2021, 10:52:48 AM

    I'm a fan of Kafka for streaming systems, so I'd recommend fetching data via the API, adding it to Kafka, and then consuming from Kafka to monitor the price and fire alerts (assuming you don't really need a real-time system with very low latency).
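
    A rough sketch of that flow in Python (assuming the kafka-python client and a local broker; fetch_quote and send_alert are just stand-ins for your price API and notification provider):

        import json
        import random
        import time

        from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

        def fetch_quote(symbol):
            # Stand-in for the real price API call.
            return {"symbol": symbol, "price": round(random.uniform(90, 110), 2)}

        def send_alert(quote):
            # Stand-in for the email/SMS/notification provider.
            print(f"ALERT: {quote['symbol']} at {quote['price']}")

        producer = KafkaProducer(
            bootstrap_servers="localhost:9092",
            value_serializer=lambda v: json.dumps(v).encode("utf-8"),
        )

        def poll_prices(tickers, interval=5):
            # Producer side: poll the API and publish each quote to Kafka.
            while True:
                for symbol in tickers:
                    producer.send("quotes", fetch_quote(symbol))
                time.sleep(interval)

        def fire_alerts(thresholds):
            # Consumer side: watch the stream and alert on threshold crossings.
            consumer = KafkaConsumer(
                "quotes",
                bootstrap_servers="localhost:9092",
                value_deserializer=lambda v: json.loads(v.decode("utf-8")),
            )
            for msg in consumer:
                quote = msg.value
                limit = thresholds.get(quote["symbol"])
                if limit is not None and quote["price"] >= limit:
                    send_alert(quote)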

    What language you use is up to you/your team. If you like Python, I'd look into Airflow or Celery as frameworks to build your workflows.

    For Java/Scala, I'd take a look at Google Dataflow, Flink, or Kafka Streams, depending on how heavy the "monitor share prices" part is.

    I think the major issue you're going to encounter is not building a system that works, but building a process that allows your team to upgrade the system, test new features, debug, etc.

    Dataflow, for example, lets you run more or less the same code both on a stream of data and on batches of historical data.

  • by matt_s on 7/13/2021, 12:39:12 PM

    Generic Answer:

    - The tech stack that is best is the one you/team are most comfortable with using.

    Specific Answer:

    - If you were using Elixir, you could have one process (GenServer) per stock ticker that calls the API for just that stock. GenServers are lightweight processes that run concurrently in the BEAM. That process could then have a set of subscribing user processes to notify, which would then alert the user. This assumes you can call an API for one stock vs. having to parse a feed of data. If I were doing this I wouldn't store any of the stock data in any way; it's ephemeral, and by the time a DB commit happens the price has changed.
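
    Roughly the same pattern in Python terms (not Elixir; asyncio tasks stand in for GenServers here, and get_price is a made-up placeholder for the per-stock API call):

        import asyncio
        import random

        async def get_price(symbol):
            # Placeholder for an API call returning the latest price of one stock.
            return round(random.uniform(90, 110), 2)

        async def ticker_worker(symbol, subscribers, interval=5):
            # One lightweight task per ticker, loosely analogous to one GenServer
            # per stock. Prices are treated as ephemeral and never persisted.
            while True:
                price = await get_price(symbol)
                for notify in subscribers.get(symbol, []):
                    notify(symbol, price)
                await asyncio.sleep(interval)

        async def main():
            # Each "subscriber" is just a callback here; in Elixir it would be
            # another process receiving a message from the ticker's GenServer.
            subscribers = {"AAPL": [lambda sym, price: print(f"{sym} is now {price}")]}
            await asyncio.gather(
                *(ticker_worker(sym, subscribers) for sym in subscribers)
            )

        asyncio.run(main())  # runs until interrupted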

    Caveats:

    You mention an API, so I assume that means you are calling an API, parsing data (and dealing with rate limits), then alerting. I'd be cautious with the term "real-time", as you are farther away from trading data than the big-name brokers. My intent isn't to downplay the idea, just to think it through based on the info in the one-sentence question.

  • by tluyben2 on 7/13/2021, 6:48:54 PM

    I would use the new k(9) if I had a small team and we could pick. Otherwise .NET Core 5. We use this for all our fintech stuff and it is very good and 'enterprise approved'. We use it with Postgres and Redis.

  • by ewmiller on 7/13/2021, 3:19:44 PM

    I work in fintech; we use a mix of Scala and Node. Scala for heavy data processing pipelines, Node for our client API, due to Node's very fast cold start times in Lambda.

  • by Bayart on 7/13/2021, 8:12:24 AM

    Realistically, anything you're already comfortable with will work for that. I would pick Python, at least for a quick prototype.

    You can start thinking about an efficient implementation once you've got a defined feature set and are seriously thinking about delivering an actual product. And when that happens, you can just use whichever tech works best for each type of service anyway.

  • by hactually on 7/13/2021, 8:43:17 PM

    For a system I built, which had some similarity, we used Go and Postgres, with Interactive Brokers running in a headless Docker container (with a virtual X server) to provide the ability to trade.

    It was pretty cool for something that took a couple of days and I believe it still works that way.

  • by murukesh_s on 7/13/2021, 7:51:45 AM

    I would recommend Node.js for this use case, and if other use cases come up, such as machine learning or distributed transactions, you can split the business logic into separate microservices using Python or Java as needed.

  • by dehrmann on 7/13/2021, 8:02:19 AM

    What languages are you really good at? As another commenter said, unless you're doing ML, most general-purpose languages can monitor an API and make API calls.

  • by mrap4 on 7/13/2021, 10:26:23 AM

    AWS SQS & Lambda

  • by christopher8827 on 7/13/2021, 10:56:40 AM

    Hey - is this essentially provisioning a bunch of background workers to constantly scan a database table, e.g. one worker for each alert (to look for a price)? How would someone incorporate this in Rails?