well, amazon does have a limit on the rate of API calls, so maybe a little botnet with separate API accounts would be in order...

 

You also need to be careful that if you hit their API too much, too fast, they may cache your results.


You also need to be careful that if you hit their API too much, too fast, they may cache your results.

 

very, very true. we used to run into that problem load testing client e-commerce web sites during the 2000s (HP LoadRunner). Luckily, LoadRunner implemented a non-caching setting, but we still always verified on the WebLogic and TIBCO app servers.


My API calls are divided into three priorities based on how often I want to poll a particular item.  Priority 1 gets polled at most every minute and is reserved for a handful of sets.  Priority 2 gets polled every 10 minutes or so.  Priority 3 gets polled every 30 minutes or so.  Every 10 seconds, I'll query up to 10 items in one API request -- the API says you are allowed one request per second -- and I allow at least 4, 4, and 3 slots for Priorities 1, 2, and 3 on that list.  Even with 500+ items being checked, they are divided so that each can be queried at my desired rate without exceeding one request per second.  So far, it has been running this way for a few months on my web hosting server without any complaints from the API.  My original plan was to make it adaptive: each priority has a range of polling intervals based on how volatile an item becomes.  For example, if an item starts going in and out of stock, it gets queried more often; if it doesn't change in a while, it gets queried less often.  The hooks are in the scripts to handle the volatility parameter, but I never implemented the checks to update the volatility variable.
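A rough sketch of that batching scheme (this is not the poster's actual code -- the item IDs, quotas, and intervals are placeholders illustrating the priority-and-quota idea described above):

```python
# Hypothetical watch lists -- in practice these would be ASINs/SKUs you track.
PRIORITY_ITEMS = {
    1: ["item-a", "item-b"],            # poll at most every minute
    2: ["item-c", "item-d", "item-e"],  # every ~10 minutes
    3: ["item-f", "item-g"],            # every ~30 minutes
}

# Minimum seconds between polls of the same item, per priority.
POLL_INTERVAL = {1: 60, 2: 600, 3: 1800}

# How many slots each priority gets in a 10-item batch (4, 4, and 3,
# favouring the higher priorities as described above).
BATCH_QUOTA = {1: 4, 2: 4, 3: 3}

def build_batch(now, last_polled):
    """Pick up to 10 items that are due for polling, honouring quotas.

    now         -- current time in seconds
    last_polled -- dict mapping item id -> time it was last queried
    """
    batch = []
    for prio, items in PRIORITY_ITEMS.items():
        due = [i for i in items
               if now - last_polled.get(i, 0) >= POLL_INTERVAL[prio]]
        batch.extend(due[:BATCH_QUOTA[prio]])
    return batch[:10]
```

Calling `build_batch` every 10 seconds and recording poll times keeps each item inside its priority's interval while never exceeding one 10-item request per cycle.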

 

Right now the data is not used for auto-notification -- it's just dumped into a database of the last results -- so I can't say whether or not it is any more reliable (or less cached) than the public web pages.  That would be an interesting test: it wouldn't be difficult to modify the php scripts to send me an SMS email when a rare item goes in stock, and I could compare that with Chrome Page Monitor over time to see if one tends to beat the other.
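The SMS-email idea above can be sketched with the standard library. This is an illustration only: the sender address and the carrier gateway address (a Verizon-style `number@vtext.com`) are made-up placeholders, and the actual send is left commented out since it needs a real SMTP server:

```python
from email.message import EmailMessage

# Most US carriers expose an email-to-SMS gateway address; the one below
# is a hypothetical example (number and carrier are placeholders).
SMS_GATEWAY = "5551234567@vtext.com"

def build_stock_alert(item_name, url):
    """Compose a short email that a carrier gateway would deliver as a text."""
    msg = EmailMessage()
    msg["From"] = "monitor@example.com"   # assumed sender address
    msg["To"] = SMS_GATEWAY
    msg["Subject"] = "In stock!"
    msg.set_content(f"{item_name} is in stock: {url}")
    return msg

# Sending would look roughly like (not run here):
# import smtplib
# with smtplib.SMTP("localhost") as s:
#     s.send_message(build_stock_alert("some rare set", "https://example.com"))
```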

 

NowInStock and CamelCamelCamel use the same Amazon API, BTW, which is one reason why they can't do near-real-time queries either -- they face the same limit on queries per second, but they have to poll many, many more items.  Only polling for Legos -- and only a handful of particular interest -- has its advantages.


Thanks everyone for your input!

I did a bit of searching for API scripts and cannot find any examples for monitoring store items. Can some of you share any pointers on how these tools work and how to get started?


that's right, script guys -- give out the secret codes you mastered through long hard work, trial and error, and hours of time spent in front of a screen.

give away your golden goose so that it works for no one anymore, and set it loose to annoy retail servers so they develop another barrier to make things more difficult for yourself.  also add a bunch of competition and lots of money to the mix and lose your advantage.  sounds like a great idea.

 

i don't code.  i don't hesitate.

i go all in when there is an opportunity.

i buy early and often.  

no code. no problem.

 

more useful than any code, this elaborate disguise helps throw pesky LEGO store managers off your track after they remember that you're the guy who bought 2 GEs and 2 T1s with cash, no VIP, 5 months ago, and you are working towards mid double-digit tumblers. [attached image]


How often are you guys refreshing Amazon in order to get a bot warning? I haven't had issues with page monitor, but I also don't refresh at really egregious intervals. I often set it to just 30 seconds (note that this is too long an interval -- you'll never get anything with it) and dial it up to about 10 refreshes a minute during times when I expect items to come back in stock. If you read this forum enough, you will notice trends in the times people claim items come back in stock, so you can watch more closely during those times.

 

And I don't think anyone should share their scripts. It's unnecessary, and it will just lead to: 1) retailer servers being brought to their knees, and subsequently 2) retailers caching more, limiting page refreshes, etc. In other words, it will just hurt everybody.

 

In my experience, a certain amount of attention from the user is still required: even with an automated refresh solution, you need to be at the computer with the product page open in a browser tab to even have a shot at getting it, and even then it is difficult. So for those of you expecting an automated solution to do all the work for you, you'll be disappointed.


@sjbdeebo2

 

Scraping (extracting raw data from a page):

1. Download the page, or if using an API, query for the product.

2. Parse the HTML using XPath or CSS selectors. The SelectorGadget add-on is useful for working out the XPath to elements in a page. You'll need a library for this, such as Nokogiri for Ruby. If you are using an API, you basically do the same thing, using XPath or JSON parsing to get the data you want.

3. Extract the price value with a regex such as [0-9.]+ and compare it against a list, CSV, or database.

4. Send a notification, sound an alert, whatever.

That's basically the process. Simple, right?  8)
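Steps 2-4 above can be sketched in a few lines. The HTML fragment and threshold here are made up for illustration; a real page would need a proper HTML parser (Nokogiri in Ruby, or similar) plus the selector you worked out:

```python
import re

# A made-up fragment standing in for a downloaded product page (step 1).
html = '<span class="price">$129.99</span>'

def extract_price(fragment):
    """Pull the first dollar amount out of a fragment using the [0-9.]+ idea."""
    match = re.search(r"\$([0-9.]+)", fragment)
    return float(match.group(1)) if match else None

TARGET_PRICE = 150.00  # hypothetical threshold from your watch list (step 3)

price = extract_price(html)
if price is not None and price < TARGET_PRICE:
    print(f"Price alert: ${price:.2f}")  # step 4: notify however you like
```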

 

Also, if you are going to use the Amazon Product API, try to find a library that handles the requests -- that thing has tight security. I had to write 150 lines of ObjC just to generate the URL with auth last week.
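For a sense of why that URL generation is so fiddly: the Product Advertising API of that era required an HMAC-SHA256 signature over a canonical string built from the sorted, percent-encoded query parameters. This is a rough sketch of that scheme, not a drop-in client -- the keys are placeholders and the endpoint/parameter names are from the old `onca/xml` interface:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

ACCESS_KEY = "AKIAEXAMPLE"       # placeholder credentials
SECRET_KEY = b"secret-example"

def signed_url(asin):
    """Build a signed ItemLookup URL in the style of the old PA-API."""
    params = {
        "Service": "AWSECommerceService",
        "Operation": "ItemLookup",
        "ItemId": asin,
        "AWSAccessKeyId": ACCESS_KEY,
        "Timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }
    # Parameters must be sorted by name and percent-encoded before signing.
    query = "&".join(f"{k}={urllib.parse.quote(str(v), safe='')}"
                     for k, v in sorted(params.items()))
    canonical = f"GET\nwebservices.amazon.com\n/onca/xml\n{query}"
    digest = hmac.new(SECRET_KEY, canonical.encode(), hashlib.sha256).digest()
    signature = urllib.parse.quote(base64.b64encode(digest).decode(), safe="")
    return f"https://webservices.amazon.com/onca/xml?{query}&Signature={signature}"
```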

 

@TheBrickClique

 

How much have you benefited from your program? Have you scored some great deals from it?


 


Thanks! Now I have an idea how this works. I will have to do some studying from here :)


key is being around the computer

 

LOL

 

SMS your positive results to your smartphone & you never have to sit at your computer again... :-)


I wrote my script yesterday for both Amazon and Walmart. I'm wondering: do you have your scripts buy the item for you when it comes in stock, or just notify you? I was a little hesitant to have mine auto-buy stuff for me...

 

 

I'm using multiple proxies and randomly sampling between 4 user agents to throw off their bot detection. It ran overnight and all day today and I haven't had an issue yet. It's checking every 5 seconds. Right now, if the price is what I want and the item is in stock, my computer volume is set to max and it speaks, saying "Price alert!", and the web page is opened in Chrome. It also sends an SMS to my phone. I had a long lunch break :)
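The rotation part of that setup can be sketched simply. This is an illustration, not the poster's script: the user-agent strings are abbreviated and the proxy addresses are placeholders -- each request just draws a random pairing:

```python
import random

# Abbreviated example browser user-agent strings (placeholders).
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9)",
    "Mozilla/5.0 (X11; Linux x86_64)",
    "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X)",
]

# Hypothetical proxy pool.
PROXIES = ["http://proxy1.example:8080", "http://proxy2.example:8080"]

def pick_headers_and_proxy():
    """Randomly sample a user agent and a proxy for the next request."""
    headers = {"User-Agent": random.choice(USER_AGENTS)}
    proxy = random.choice(PROXIES)
    return headers, proxy
```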


Does anyone have any specifics on cache times, or any hints on a way to avoid them?

Reddit said they cache requests for 30s. I'd prefer to poll quicker than this.

I was running GET calls with no-cache headers and must still have been receiving cached results. To get around this I switched to POST requests; however, Walmart appears to auto-deny POST requests with a 405 error.
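One common workaround when no-cache headers get ignored, without resorting to POST: append a unique throwaway query parameter so any intermediate cache sees a fresh URL on every request. A minimal sketch (the product URL is a placeholder, and whether a given retailer's cache honours this is not guaranteed):

```python
import time
import urllib.request

def fresh_request(url):
    """Build a GET request with no-cache headers and a cache-busting param."""
    sep = "&" if "?" in url else "?"
    busted = f"{url}{sep}_={int(time.time() * 1000)}"  # unique per request
    return urllib.request.Request(busted, headers={
        "Cache-Control": "no-cache",
        "Pragma": "no-cache",
    })

# Example (placeholder URL); pass the Request to urllib.request.urlopen().
req = fresh_request("https://www.walmart.com/ip/12345")
```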


I realize some of you may want to keep this info private but I'm trying to step up my game and learn what some of the pros are doing to stay ahead of the noobs. I'm aware of keepa and camelcamelcamel for Amazon and I've looked into some web monitor chrome extensions but it seems like there are other tools I may be unaware of. 

 

How are you guys being notified when items come back in stock or price drops on lego, toysrus, walmart, target, etc? Or even when items that were gone return?

 

Anyone have info on Amazon auto-buy scripts? I've done some googling but not finding much info.

 

I'm actually a relatively new lego investor but would like to put some of this info into practice for other areas as well. I'm not looking for someone to hold my hand I just need to be pointed in the right direction and I'll dig deeper.


This is a quick emergency trip. Will be back in like 2 days. Just found out this morning I needed to go. I thought I was done for a while. Oh well. At least the boss got us upgraded to business class :)


I am on a plane now heading to the UK. When I get back, I will finish up the notifications and other upgrades to the stock tracker that I am working on.

Be safe and Merry Christmas!


Sorry to bring this thread up again, but I need a little advice please. I know you guys don't want to give out too much info on your scripts, but I have one running in England and it looks like Amazon caught me: I now get a robot page where I have to enter a code to get back into Amazon each time, so the script won't work anymore.  I am only polling every 60 seconds, so I thought it wouldn't be a problem. Has this happened to anyone else?  I've got it on a VPN now, but I wonder if it's because I was running from a UK IP to Amazon US?     Thank you, Stu

