
Network latency vs. end-to-end latency

Geva Perry wrote an excellent blog post on Extreme Transaction Processing on Wall Street.

"So basically you now have thousands and thousands of machines buying and selling stocks and other securities from other machines based on extremely complex (and automated) computer models. So it has become a latency game — low-latency, that is."

When it comes to low latency, we mostly look at the networking level and at how we can push data at the speed of light. We often forget that the applications that need to consume that information are a critical piece of the information distribution food chain. It is therefore not surprising that while we have good answers for delivering data faster than ever at the networking level, enabling applications to consume that data effectively is still a challenge.

There is a huge difference between network latency and end-to-end latency, i.e., the time from the point the information arrives over the network until the point a trader sees it in a desktop application. Normally, the steps involved in the process include enrichment (turning the incoming data into a more meaningful and consistent format), filtering, and distribution of the right data to the right consumer.
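
To make the distinction concrete, here is a minimal sketch (the stage names and types are hypothetical, not any product's API) showing that the end-to-end figure is the network time plus everything that happens after the data leaves the wire:

    import java.util.concurrent.TimeUnit;

    // Hypothetical pipeline: every stage after the wire adds to the latency a trader actually sees.
    public class EndToEndLatencyDemo {

        // A raw tick as it arrives off the wire.
        record RawTick(String symbol, double price, long wireArrivalNanos) {}

        // The enriched, consumer-ready form of the same event.
        record EnrichedTick(String symbol, double price, String currency, long wireArrivalNanos) {}

        static EnrichedTick enrich(RawTick t) {
            // Enrichment: normalize the tick and attach reference data (currency, metadata, ...).
            return new EnrichedTick(t.symbol(), t.price(), "USD", t.wireArrivalNanos());
        }

        static boolean matchesSubscription(EnrichedTick t, String subscribedSymbol) {
            // Filtering: only ship what this particular consumer asked for.
            return t.symbol().equals(subscribedSymbol);
        }

        static void deliverToDesktop(EnrichedTick t) {
            // Distribution: the clock stops when the trader's application receives the event.
            long endToEndNanos = System.nanoTime() - t.wireArrivalNanos();
            System.out.printf("%s %.2f delivered, end-to-end latency = %d us%n",
                    t.symbol(), t.price(), TimeUnit.NANOSECONDS.toMicros(endToEndNanos));
        }

        public static void main(String[] args) {
            RawTick raw = new RawTick("ACME", 101.25, System.nanoTime()); // network latency already paid here
            EnrichedTick tick = enrich(raw);                              // + enrichment time
            if (matchesSubscription(tick, "ACME")) {                      // + filtering time
                deliverToDesktop(tick);                                   // + distribution time
            }
        }
    }

Every one of those stages shows up in the number the trader cares about, even when the wire time itself is negligible.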

Decreasing end-to-end latency is the bigger challenge IMO, and it can only be achieved if we provide an architecture that covers all aspects of data distribution.

There are basically two things that we're doing at GigaSpaces to address the end-to-end latency challenge:

  1. Create a processing and data grid that stores the data arriving from the stream and maintains the current state of the market in real time.
  2. Enable efficient server-side filtering through a continuous query approach, ensuring that only the relevant data is sent to each consumer (see the sketch below).
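
The continuous-query idea is sketched below in miniature (the interfaces here are illustrative only, not the GigaSpaces API): the consumer registers its interest once, and the server pushes only matching updates, so unfiltered data never crosses the network.

    import java.util.Map;
    import java.util.Set;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.CopyOnWriteArraySet;
    import java.util.function.Consumer;

    // Illustrative server-side continuous query: subscribers register a query once,
    // and only matching updates are pushed to them.
    public class ContinuousQuerySketch {

        record Quote(String symbol, double price) {}

        // symbol -> listeners interested in that symbol (a very simple "query" registry)
        private final Map<String, Set<Consumer<Quote>>> subscriptions = new ConcurrentHashMap<>();

        // Consumer side: register the query once; no polling, no client-side filtering.
        public void subscribe(String symbol, Consumer<Quote> listener) {
            subscriptions.computeIfAbsent(symbol, s -> new CopyOnWriteArraySet<>()).add(listener);
        }

        // Server side: on every update, match against registered queries and push only hits.
        public void onUpdate(Quote q) {
            Set<Consumer<Quote>> interested = subscriptions.get(q.symbol());
            if (interested != null) {
                interested.forEach(l -> l.accept(q)); // only relevant data leaves the server
            }
        }

        public static void main(String[] args) {
            ContinuousQuerySketch grid = new ContinuousQuerySketch();
            grid.subscribe("ACME", q -> System.out.println("trader screen: " + q));
            grid.onUpdate(new Quote("ACME", 101.30));  // pushed to the subscriber
            grid.onUpdate(new Quote("OTHER", 42.00));  // filtered out on the server side
        }
    }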

 
With this approach, low end-to-end latency is achieved because the processing is done in-memory, together with the data, which removes unnecessary network hops.

Another important challenge is keeping latency consistent during peak-load events. We can scale the data and the processing at the same time through partitioning of what we refer to as Processing Units, a core piece of our Space-Based Architecture.
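
A back-of-the-envelope sketch of the partitioning idea follows (routing by instrument symbol is an assumption made for the example; any stable key would do). Each partition owns a slice of the data together with the processing that works on it, so both scale by simply adding partitions:

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative partitioning: updates are routed by a key so data and processing
    // scale together by adding partitions (the "Processing Unit" idea, greatly simplified).
    public class PartitionRoutingSketch {

        record Quote(String symbol, double price) {}

        // Each partition owns its slice of the data and processes it locally (no extra hop).
        static class Partition {
            private final int id;
            private final List<Quote> localData = new ArrayList<>();

            Partition(int id) { this.id = id; }

            void process(Quote q) {
                localData.add(q); // state and processing live in the same memory space
                System.out.println("partition " + id + " processed " + q);
            }
        }

        private final List<Partition> partitions = new ArrayList<>();

        PartitionRoutingSketch(int partitionCount) {
            for (int i = 0; i < partitionCount; i++) partitions.add(new Partition(i));
        }

        // Route by a stable hash of the key so the same symbol always lands on the same partition.
        void route(Quote q) {
            int idx = Math.floorMod(q.symbol().hashCode(), partitions.size());
            partitions.get(idx).process(q);
        }

        public static void main(String[] args) {
            PartitionRoutingSketch grid = new PartitionRoutingSketch(4);
            grid.route(new Quote("ACME", 101.35));
            grid.route(new Quote("ACME", 101.40)); // same partition as the previous ACME tick
            grid.route(new Quote("OTHER", 42.10));
        }
    }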

Since we maintain the current state in a Data Grid, we are able to filter data effectively through indexing. With this approach, we can even handle slow consumers effectively, in a way that does not affect the overall latency, as it does in many messaging-based systems.
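
The sketch below illustrates the point (again with made-up types, not the actual product API): because the grid always holds the current state per key, a consumer that falls behind can simply read the latest values when it catches up, instead of forcing the publisher to buffer, and be throttled by, a growing backlog of intermediate updates:

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Illustrative "current state" index: every update overwrites the latest value for its key,
    // so lookups are cheap and slow consumers never create back-pressure on the fast path.
    public class CurrentStateIndexSketch {

        record Quote(String symbol, double price) {}

        // Index keyed by symbol: lookups are O(1), no scan over the stream history.
        private final Map<String, Quote> latestBySymbol = new ConcurrentHashMap<>();

        // Fast path: every update just overwrites the current state for its key.
        public void onUpdate(Quote q) {
            latestBySymbol.put(q.symbol(), q);
        }

        // Slow consumer path: read the state of interest directly via the index.
        public Quote currentQuote(String symbol) {
            return latestBySymbol.get(symbol);
        }

        public static void main(String[] args) {
            CurrentStateIndexSketch grid = new CurrentStateIndexSketch();
            grid.onUpdate(new Quote("ACME", 101.10));
            grid.onUpdate(new Quote("ACME", 101.45)); // overwrites the older tick
            // A consumer that fell behind still gets the latest picture, without replaying a backlog.
            System.out.println("slow consumer sees: " + grid.currentQuote("ACME"));
        }
    }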

End-to-end latency depends on many elements beyond pure networking, and those elements often have a greater effect on latency than the network itself. It can be addressed effectively only by applying an architecture that covers all of these aspects.

Space-Based Architecture was designed to provide exactly such an architecture.

 