Quote:
Originally Posted by drexl
It should be backward compatible: fields in the response should be identical. If this is verified then it is a case of deactivating caching.
Filtering of data wasn't in v1 afaik. It still has to be done on our end.
Of course cb could just communicate. Maybe they will do it soon.
No, filtering of data was not in the original, but the way it was used almost demanded either caching the XML or using a database, so filtering at our end made sense.
If they are going to require an api call on every page hit (which is the only way using the client ip properly could work, imho), then we move away from caching on our end and depend on filtering on theirs. Other things in there point that way too, such as being able to request a certain number of records to skip; that would be to allow for pagination, I would think.
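For what it's worth, a "records to skip" parameter usually maps to pagination like this. The parameter names ("skip", "count") are my guesses here, not confirmed from any cb docs:

```python
# Hypothetical pagination helper. The query parameter names "skip" and
# "count" are assumptions based on the "records to skip" feature above,
# not the actual API's parameter names.
def page_params(page, per_page):
    """Return query parameters for a 1-indexed page of results."""
    return {"skip": (page - 1) * per_page, "count": per_page}

# Page 3 with 20 records per page skips the first 40 records.
print(page_params(3, 20))
```

So page 3 at 20 per page would be `skip=40&count=20` on the request, with the filtering done server-side before the skip is applied.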
My version of the current usage is to pull everything into a database and then pull from there to build the filtered cached files that my sites use.
I do one call to the api every 5 minutes, and that loads out to 17 different sites, each one niched differently.
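Roughly, that one-call-fans-out-to-many-niches pattern looks like the sketch below. Everything here is illustrative: `fetch_feed` stands in for the real 5-minute api call, and the niche map is made up:

```python
import json
import os
import tempfile

def fetch_feed():
    # Stand-in for the single 5-minute api call; in reality this would
    # hit the remote endpoint and parse the XML into records.
    return [
        {"title": "Alpha", "category": "health"},
        {"title": "Beta", "category": "finance"},
    ]

# Hypothetical site-to-niche mapping; the real setup has 17 sites.
NICHES = {"site1": "health", "site2": "finance"}

def refresh_caches(cache_dir):
    records = fetch_feed()                 # one upstream hit...
    for site, category in NICHES.items():  # ...fans out to every site
        filtered = [r for r in records if r["category"] == category]
        with open(os.path.join(cache_dir, site + ".json"), "w") as f:
            json.dump(filtered, f)

cache_dir = tempfile.mkdtemp()
refresh_caches(cache_dir)
```

The point is the upstream load stays at one request per 5 minutes no matter how many sites read the cached files; moving to per-page-hit api calls throws that away.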
Changing that up to an api call on every page, plus the filtering, etc., will be a bitch and a half.