11-26-2018, 06:30 PM
Mr. Deltoid
Confirmed User
 
Join Date: Feb 2005
Posts: 132
Quote:
Originally Posted by sarettah
To do this without access to CB's back end you have to use CB's API, so you can use the XML, JSON, or YAML version of the API.
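
For illustration only, here is a rough Python sketch of what the "pull the feed" step can look like. The endpoint, query parameters, and token below are placeholders, not the real affiliate URL; the real values come from CB's own docs.

import json
import urllib.request

# Placeholder endpoint -- substitute the real affiliate feed URL and your
# own token from CB's documentation; the "format=json" style flag is assumed.
FEED_URL = "https://example.com/affiliate/api/online_rooms?format=json&token=YOUR_TOKEN"

def fetch_feed(url=FEED_URL):
    # Download the full feed once and return it as a list of room dicts.
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))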

If you want random pulls from everything available with a particular tag then you have to read through the entire API each time. If you just want the first 6 or 7 with that tag then it may or may not need to go through the entire feed.
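
For example, assuming the feed has already been parsed into a list of dicts (the "tags" and "gender" field names here are assumptions, not the documented schema), the two approaches look like this:

import random

# Toy data standing in for the parsed feed.
rooms = [
    {"username": "model_a", "gender": "f", "tags": ["blonde", "dance"]},
    {"username": "model_b", "gender": "f", "tags": ["dance"]},
    {"username": "model_c", "gender": "m", "tags": ["dance", "guitar"]},
]

def rooms_with_tag(rooms, tag):
    tag = tag.lower()
    return [r for r in rooms if tag in (t.lower() for t in r.get("tags", []))]

tagged = rooms_with_tag(rooms, "dance")
first_six  = tagged[:6]                                  # first 6 in feed order
random_six = random.sample(tagged, min(6, len(tagged)))  # random 6 from all matches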

You will also want to parse on gender, I would imagine, to help shield against people using bullshit tags.
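
A minimal sketch of that double check, again with assumed field names: a room only counts if its gender field matches as well as the tag.

def filter_rooms(rooms, tag, gender="f"):
    # Only trust a tag when the room's own gender field also matches.
    tag = tag.lower()
    return [
        r for r in rooms
        if r.get("gender") == gender
        and tag in (t.lower() for t in r.get("tags", []))
    ]

# e.g. filter_rooms(rooms, "dance", gender="f")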

So you need an API load that can go through the entire feed every time it runs. You will only want to load every now and again, like every 5 minutes. You do not want to load the full API every time the widget gets hit.
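
One simple way to enforce that (a sketch building on the fetch_feed() function above; the cache filename and 5-minute window are just examples):

import json
import os
import time

CACHE_FILE = "cb_feed_cache.json"   # placeholder path
MAX_AGE = 5 * 60                    # refresh at most every 5 minutes

def get_rooms():
    # Serve from the local cache unless it is older than MAX_AGE.
    fresh = (
        os.path.exists(CACHE_FILE)
        and time.time() - os.path.getmtime(CACHE_FILE) < MAX_AGE
    )
    if fresh:
        with open(CACHE_FILE) as f:
            return json.load(f)
    rooms = fetch_feed()            # full API pull from the earlier sketch
    with open(CACHE_FILE, "w") as f:
        json.dump(rooms, f)
    return rooms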

That script needs to either write your data to files or store it in a DB so you can reuse it.
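
If you go the database route, a bare-bones SQLite version might look like this (table and column names are made up for the example):

import sqlite3

def store_rooms(rooms, db_path="cb_rooms.db"):
    # Rebuild the table from the latest feed so the widget scripts can reuse it.
    con = sqlite3.connect(db_path)
    with con:
        con.execute("DROP TABLE IF EXISTS rooms")
        con.execute(
            "CREATE TABLE rooms (username TEXT, gender TEXT, tags TEXT, room_url TEXT)"
        )
        con.executemany(
            "INSERT INTO rooms VALUES (?, ?, ?, ?)",
            [
                (r.get("username"), r.get("gender"),
                 ",".join(r.get("tags", [])), r.get("room_url", ""))
                for r in rooms
            ],
        )
    con.close()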

So that requires a cron job for refreshing the API load on a regular basis.
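
For example, a crontab entry along the lines of */5 * * * * /usr/bin/python3 /path/to/load_feed.py (the script path is a placeholder) would rerun the loader every 5 minutes.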

So far, that is one script for pulling and storing the API load, plus a cron job set up to run it.

You could have the same script write the files for the widget. If you are going to run more than one version at a time then you need a naming convention that you can use to pull the right files.
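
Something as simple as building the filename from the widget's settings covers that; the scheme below is just an example:

def widget_filename(tag, gender, count):
    # One cache file per widget configuration, e.g. widget_f_dance_6.json
    return "widget_{}_{}_{}.json".format(gender, tag, count)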

Or you can pull from the database each time you load the page, so the widget can differ from user to user with a different set of models.
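
A sketch of that per-page-load pull, assuming the SQLite table from the storage sketch above:

import sqlite3

def random_rooms(db_path, tag, gender="f", limit=6):
    # A fresh random set straight from the database on every request,
    # so each visitor can see a different group of models.
    con = sqlite3.connect(db_path)
    try:
        cur = con.execute(
            "SELECT username, room_url FROM rooms "
            "WHERE gender = ? AND tags LIKE ? "
            "ORDER BY RANDOM() LIMIT ?",
            (gender, "%" + tag + "%", limit),
        )
        return cur.fetchall()
    finally:
        con.close()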

If you do the latter then that is a second script. If you do the former then it adds complexity to the API pull and load script.

Then, of course, you need a little bit of HTML and CSS coding to style your widget the way you want.
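
The markup itself can be generated from the stored rows; the snippet below (class names and layout are just an example to hang your CSS on) takes the (username, room_url) pairs from the query sketch above:

def render_widget(rooms):
    # Render the selected rooms as a simple block of links.
    items = "\n".join(
        '  <a class="cb-room" href="{url}">{name}</a>'.format(url=url, name=name)
        for name, url in rooms
    )
    return '<div class="cb-widget">\n{}\n</div>'.format(items)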

And then of course there is all the other stuff that you might be thinking that I have not hit yet.

Throw in testing and debugging and you are well over an hour or two, unless you are expecting the developer to build on top of something they already have. For example, I would probably use an XML load that I developed that eats through the entire API, parses it out and loads it into the database in between 5 and 10 seconds depending on server load. But you don't get that for $100. That has been developed and refined over a 5-year period. Not sure how many hours are in it, but it is very efficient and it is totally custom.
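
For a general idea of what a full pass over the XML feed can look like (this is a generic sketch, not the custom loader described above; element and field names are assumptions), streaming with iterparse keeps memory flat even on a large feed:

import xml.etree.ElementTree as ET

def parse_xml_feed(source):
    # Stream through the XML feed and collect one dict per room,
    # ready to hand to the database step sketched earlier.
    rooms = []
    for _, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == "room":                      # assumed element name
            tags_text = elem.findtext("tags", "") or ""
            rooms.append({
                "username": elem.findtext("username", ""),
                "gender":   elem.findtext("gender", ""),
                "tags":     [t for t in tags_text.split(",") if t],
                "room_url": elem.findtext("room_url", ""),
            })
            elem.clear()                            # free the element as we go
    return rooms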

So, that is what I am seeing.

Now, if a developer has access to the CB backend directly then that is a totally different scenario.

.
Interesting, thanks for the outline. How can I get in touch with you?