
Ask HN: Please, help me understand how I can improve my workflow

by iamjeff on 1/24/2018, 5:33:24 PM with 0 comments
Hi, HNers!

I desperately need help scraping the Top 20 URL links (and titles) from Google's SERPs for a bunch of keywords.

Why I need the URLs and titles:

I am launching a personal blog (my very first site) and need to i) understand the type of content that ranks and ii) design a backlink strategy that has a chance of working (by reverse-engineering the backlinks of future competitors).

Having a bunch of links that I can then plug into ahrefs.com will help me out a lot (money is tight, so I will be using Ahrefs' $7, 7-day trial).

What I have already tried:

- SERP Scraper from urlprofiler.com - Unfortunately, it took me ~20 hours to get the top 10 URLs and titles for 195 keywords. I cannot use proxies because I need to see SERP results from my country-code TLD (google.co.ke). Due to the inefficiency of this setup, I stopped using it two days ago.

- Quotes from Fiverr and Upwork came in between $65 and $200. Unfortunately, I cannot afford professional help at the moment.

What I am trying:

- I am using a combination of the Linkclump Chrome extension + MurGaa Recorder for Macintosh + Excel.

- Here's my present workflow using these tools: copy-paste one keyword into Google.co.ke and search >> trigger the Linkclump action (copy to clipboard) with a shortcut key recorded using MurGaa Recorder >> switch windows to Excel >> paste the SERP links into Excel >> repeat for the next keyword.

This process is quite inefficient, but it is faster than the first setup (a rough sketch of the scripted alternative I am imagining is below).
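For context, this is the kind of script I am hoping exists or could be written. It is only a rough, untested sketch: it assumes Python with the requests and beautifulsoup4 libraries (names I have only read about; I cannot write or verify this myself), and I understand Google may block, CAPTCHA, or reshuffle its markup for automated requests, so corrections are very welcome:

    import csv
    import time

    import requests
    from bs4 import BeautifulSoup

    # Placeholder keywords -- the real list has 195 entries.
    KEYWORDS = ["keyword one", "keyword two"]

    # A browser-like User-Agent; Google may still block or vary the results.
    HEADERS = {"User-Agent": "Mozilla/5.0"}


    def fetch_serp(keyword, num=20):
        """Return up to `num` (title, url) pairs from a google.co.ke results page."""
        resp = requests.get(
            "https://www.google.co.ke/search",
            params={"q": keyword, "num": num},
            headers=HEADERS,
            timeout=30,
        )
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        results = []
        # Organic results are usually links whose anchor wraps an <h3> title,
        # but Google's markup changes often, so this may need adjusting.
        for a in soup.find_all("a", href=True):
            h3 = a.find("h3")
            if h3 and a["href"].startswith("http"):
                results.append((h3.get_text(strip=True), a["href"]))
        return results[:num]


    with open("serps.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["keyword", "title", "url"])
        for kw in KEYWORDS:
            try:
                rows = fetch_serp(kw)
            except requests.RequestException as exc:
                print(f"Failed on '{kw}': {exc}")
                continue
            writer.writerows([kw, title, url] for title, url in rows)
            time.sleep(10)  # pause between keywords to reduce the chance of being blocked

If something like that is workable, the resulting CSV could go straight into Excel and then into Ahrefs, replacing the copy-paste steps entirely.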

Why I need help:

I am not code-literate, so I cannot roll my own solution.

I do not have the money to buy a commercial solution.

How you can help me out:

Suggest a solution or a tip. For instance, is it possible to automate my present workflow using Mac automation scripts? If so, which ones?
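Even automating just the "switch to Excel and paste" step would save a lot of time. Here is a rough, untested sketch of what I imagine that could look like (it assumes Python plus the built-in macOS pbpaste command, and that Linkclump puts one link per line on the clipboard, which depends on its settings; again, I cannot verify this myself):

    import csv
    import subprocess

    # macOS only: pbpaste prints the current clipboard contents.
    # Run this right after triggering the Linkclump copy-to-clipboard action.
    clipboard = subprocess.run(["pbpaste"], capture_output=True, text=True).stdout

    # Keep non-empty lines; the exact clipboard format depends on Linkclump's settings.
    rows = [[line.strip()] for line in clipboard.splitlines() if line.strip()]

    # Append to a CSV that Excel can open, instead of switching windows and pasting.
    with open("serp_links.csv", "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerows(rows)

    print(f"Appended {len(rows)} lines to serp_links.csv")

If a script like that can be bound to a shortcut key, I could stay in the browser the whole time.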

I hope that I was not too verbose. I also do not want to come across as demanding help or free stuff; I just need help if at all possible.