I have an application that receives data in JSON format from one of our other servers. The problem I'm facing is that there is significant latency when requesting this information. Since a large amount of data is transferred (roughly 1000 records per request, and each record is quite large), is there a compression approach that could help reduce this slowness? If so, which compression scheme would you recommend?
I read on another thread that the schema of the data also matters for the type of compression to use. The schema of the data is consistent and looks like the following:
```
:desc=>some_description
:url=>some_url
:content=>some_content
:score=>some_score
:more_attributes=>more_data
```
Can anyone recommend a solution to reduce this latency? The latency they see is around 6-8 seconds. I am building this application with Ruby on Rails, and the server providing the data mostly uses Python.
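Since every record repeats the same keys, a quick way to gauge how much compression would help is to deflate a representative payload before changing anything. A minimal Ruby sketch (the record contents below are made-up placeholders matching the schema above):

```ruby
require 'json'
require 'zlib'

# Hypothetical record matching the schema in the question;
# the field values are placeholders, not real data.
record = {
  desc:            "some description of the item",
  url:             "http://example.com/items/1",
  content:         "long body text " * 40,
  score:           87,
  more_attributes: "more data"
}

# Simulate one response of ~1000 records and measure the win
payload    = Array.new(1000) { record }.to_json
compressed = Zlib::Deflate.deflate(payload)

puts "raw: #{payload.bytesize} bytes, deflated: #{compressed.bytesize} bytes"
```

Repeated key names and similar values make JSON highly compressible; if the deflated size is a small fraction of the raw size, enabling gzip on the wire is likely the cheapest win.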
Posted on 2010-03-26 18:25:21
I would first look at how much of that 8s latency is related to:
- DB indexes
- caching
- a faster to_json library
Some excellent resources are the NewRelic podcasts on Rails scalability: http://railslab.newrelic.com/2009/02/09/episode-7-fragment-caching
- if the keys are pretty much the same, you may implement the solution from [Compression algorithm for JSON encoded packets?](https://stackoverflow.com/questions/395505/compression-algorithm-for-json-encoded-packets/395668#395668); you may also want to look at [https://github.com/WebReflection/json.hpack/wiki/specs-details](https://github.com/WebReflection/json.hpack/wiki/specs-details) and [http://www.nwhite.net/?p=242](http://www.nwhite.net/?p=242)
- in addition to this, you may also compress (gzip) it from your frontend server: [http://httpd.apache.org/docs/2.0/mod/mod_deflate.html](http://httpd.apache.org/docs/2.0/mod/mod_deflate.html) and [http://wiki.nginx.org/NginxHttpGzipModule](http://wiki.nginx.org/NginxHttpGzipModule)
- If the data structure is constant, you can also try to implement a binary service such as Thrift; it is much, much faster and includes compression, but is also more difficult to maintain: [http://www.igvita.com/2007/11/30/ruby-web-services-with-facebooks-thrift/](http://www.igvita.com/2007/11/30/ruby-web-services-with-facebooks-thrift/)
- If it is suitable to your needs, you could build some kind of versioning/cache system server-side and send only the records that were modified (though that is fairly heavy to implement)
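The key-deduplication idea behind json.hpack, linked above, can be sketched in a few lines of Ruby for a homogeneous collection: emit the shared key list once, then one array of values per record. This is an illustrative simplification of the idea, not the full hpack format, and the sample records are made up:

```ruby
require 'json'

# Hypothetical records following the question's schema
records = [
  { desc: "first",  url: "http://example.com/1", content: "aaa", score: 1, more_attributes: "x" },
  { desc: "second", url: "http://example.com/2", content: "bbb", score: 2, more_attributes: "y" }
]

# Header row of keys, then one row of values per record
keys   = records.first.keys
packed = [keys] + records.map { |r| r.values_at(*keys) }

# The receiving side reverses the transformation
header, *rows = packed
unpacked = rows.map { |row| header.zip(row).to_h }

puts packed.to_json
```

With 1000 records sharing the same five keys, this strips 999 copies of every key name before any byte-level compression even runs; gzip on top of the packed form still helps for the values.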
Posted on 2010-03-26 17:47:44
gzip can significantly reduce the size of text data and improve load times. It is also what YSlow recommends.
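To illustrate the effect, here is a small Ruby round trip using the same gzip framing that mod_deflate or nginx apply on the wire; the sample body is a placeholder, not real data:

```ruby
require 'zlib'
require 'stringio'

# Placeholder JSON-ish body standing in for a large response
body = '{"desc":"some description","score":87}' * 1000

# Compress as the server (or frontend proxy) would
io = StringIO.new
Zlib::GzipWriter.wrap(io) { |gz| gz.write(body) }
compressed = io.string

# Decompress as the client would
restored = Zlib::GzipReader.new(StringIO.new(compressed)).read

puts "raw: #{body.bytesize} bytes, gzipped: #{compressed.bytesize} bytes"
```

In practice you would not do this by hand: enabling gzip in the web server and sending `Accept-Encoding: gzip` from the client gets the same result transparently.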
https://stackoverflow.com/questions/2522204