In this tutorial, I am going to give you a step-by-step guide on how to properly use GSA Website Contact for lead generation. GSA Website Contact is a very powerful mass contact form submitter. In basic terms, it works by scraping the search engines for your list of keywords and then submitting your message to the websites it finds via their contact forms.
I am very old school when it comes to marketing. I really like to use newsletters and emails whilst prospecting for new clients or sending out my promotional messages. However, the bitter truth is that emails are not enough on their own. A lot of the time, emails end up in spam folders, competing with thousands of other emails, and it is very easy for a busy webmaster or business owner to overlook them. Equally, there are many factors that will determine the success of your email campaign, which include but are not limited to: your domain and IP whitelist/blacklist status, your proxies, your email/newsletter content and much more.
I like to submit my message via contact forms because it is a very effective way of getting your message across to webmasters and business owners. In my experience, contact form submissions have roughly a 95% open rate, simply because a contact form submission is delivered to the webmaster/business owner via their own SMTP server, which in turn guarantees a very high rate of delivery. Up until quite recently, I was doing a lot of website contact form submissions manually. However, this process is extremely slow, unrewarding and prohibitively expensive. GSA Website Contact is an automated mass website contact form poster that has saved me a lot of money in costs, allowed me to generate a lot of quality guest blog posts for my money sites and helped me secure many lucrative clients.
Having briefly made the case for incorporating GSA Website Contact into a marketing strategy, let's now turn to configuring the software itself.
When you click on "Options" inside the main interface, you will be presented with an option to configure the total number of threads you would like to use for scraping ("Scraping/Search") and submitting contact forms ("Checking/Submission"). I normally use anywhere between 100 and 200 threads on my 20-core/256 GB dedicated server with a powerful graphics card. I have tried using 500 and 1000 threads, but even the most powerful dedicated servers crash or become sluggish, so I took the thread count down to 100-200.

You can also choose to "stop projects when no proxy is alive". I do not check this option because my list of private and public proxies is just too large for there to be no alive proxies. I leave the "HTTP time out" at its 30-second default value.

The "Global Website Blacklist" is an extremely useful option that I would strongly recommend to everyone. Sometimes, you may have a list of websites that have unsubscribed from your communications. It is very important to add these websites to the blacklist to avoid any potential legal ramifications. You can add almost any website that you would not like to contact, either manually or by importing a list of domains. I usually just add people who have unsubscribed from my communications, as there are simply too many websites to add otherwise.

If all of your projects are very similar in nature and are promoting the same thing, you may want to "disallow different projects to send to the same website". It is a no-brainer option: you do not want to spam one website with the same message. However, as most of my messages are very different in nature, I do not use this option.
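GSA Website Contact handles the blacklist internally, but if you keep unsubscribe records in mixed formats (full URLs, bare domains, inconsistent casing), it helps to normalise them before importing. The sketch below is a hypothetical pre-processing helper, not part of the software; the function name and behaviour are my own.

```python
from urllib.parse import urlparse

def normalise_blacklist(entries):
    """Reduce a mixed list of URLs/domains to a deduplicated set of bare domains.

    Hypothetical clean-up step before importing a list into the
    Global Website Blacklist; accepts full URLs and bare domains."""
    domains = set()
    for entry in entries:
        entry = entry.strip().lower()
        if not entry:
            continue
        # Full URLs carry "//" and parse cleanly; bare domains are taken as-is.
        host = urlparse(entry).netloc if "//" in entry else entry.split("/")[0]
        if host.startswith("www."):
            host = host[4:]
        domains.add(host)
    return sorted(domains)

unsubscribed = [
    "https://www.example.com/contact",
    "EXAMPLE.com",
    "http://shop.example.org/",
]
print(normalise_blacklist(unsubscribed))  # → ['example.com', 'shop.example.org']
```

One clean domain per line is the least ambiguous format to import, and deduplicating first keeps the blacklist from ballooning.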
Once you create a new campaign, under the "scraping" tab, you will see some options. Let's quickly run through them. On the left-hand side you will see two proxy options: "use proxies for search engines" and "use proxies for parsing/sending". Both options do exactly what they say on the tin. I strongly recommend that you check both of them, as it is very important to be using proxies during both stages. When scraping, you need a lot of proxies because search engines will simply ban your IP if you are not using any, and you will not be able to scrape many websites to post to. It is equally important to use proxies when sending your contact form messages, because your IP can become blacklisted for spam and you do not want that, especially if you are using your local IP address. Of course, you will need more proxies for scraping the search engines than for submitting contact forms. I check both boxes and select "[all types]", which just means that you will be using private as well as public proxies. It is better to use both, since private proxies alone may not be enough, especially if you are planning on scraping websites for tens of thousands of keywords.
Click the "Configure" button in the top left-hand corner. This will open up a small window. Inside this window, you will be able to add your own private proxy list and test all your proxies to remove any dead ones.
You can also add public proxy sources by clicking on "Add/Edit ProxySites". On the left, you will see a densely populated list of public proxy sources/websites. You can check as many sources as you wish. I do not check them all, only about 100, just to be on the safe side. Once I have selected my proxy lists, I leave the other values at their default settings.
Whilst you are inside the "Proxy List" window, click on the "Options" button. Here, you can select "Automatically search for new proxies every X minutes". I also like to check the "test proxies" box for "public only" proxies and "re-test previously working proxies". I choose Bing for the "test to perform". I like to have this option enabled for public proxies because public proxies are very unstable and temperamental: one minute they are working and the next, they are down. I also enable the "Automatically disable public proxies when detected to be down" option. I do not use the same option for private proxies because I more or less know when they are going to expire. However, it is also good to "Automatically disable private proxies when detected to be down". I leave the other options at their default settings. Proper proxy configuration is imperative to maximise the success rate of your campaigns.
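To illustrate what the "test proxies" option is doing behind the scenes, here is a minimal sketch of a proxy aliveness check in Python. This is my own illustration of the idea, not GSA Website Contact's actual implementation; the Bing test URL simply mirrors the "test to perform" choice described above.

```python
import urllib.request

def check_proxy(proxy, test_url="https://www.bing.com", timeout=10):
    """Return True if the proxy can fetch the test URL within the timeout.

    Loose illustration of what a 'test proxies' pass does: route a request
    through the proxy and treat anything other than HTTP 200 as dead."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    )
    try:
        return opener.open(test_url, timeout=timeout).status == 200
    except Exception:
        return False

def prune_dead(proxies, checker=check_proxy):
    """Keep only the proxies the checker reports as alive."""
    return [p for p in proxies if checker(p)]
```

Re-running a check like this on a schedule, as the "Automatically search for new proxies every X minutes" option does, is what keeps a public proxy pool usable despite its churn.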
Open the "Filter" tab. Here you can set up some filters for your target websites. These filters are very helpful in ensuring that you only scrape and contact niche-related websites. There are numerous filters that you could apply. You can "accept only websites with the following language" and then select the languages. You can "accept only websites from the following country [sic]". You can also "Accept duplicate domains (different contact URLs on the same domain)". I can imagine that this would apply to marketplaces that have multiple vendors with their own contact forms, directories and forums. You can add your list of keywords for the "URL must contain" filter. This is a very helpful filter for scraping only URLs that have your keywords present. However, many domains and URLs may be within a niche but not necessarily contain niche-related keywords, so use this option with caution so as not to limit your results. You could also use the "Website must contain" option and add your set of niche-related keywords that must be present on the source page, contact page, HTML source or visible text area (you would have to check these options individually at the bottom). This is a very powerful filter for targeting niche-related sites with laser precision. In spite of these great filters, I still do not use them, since I find that I get good results without them. Equally, given that my money sites and products are not niche-specific per se, I do not have a pressing need for these filters. However, this is not to say that you should not use them, as they are great for increasing the focus of your campaigns.
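The "URL must contain" behaviour boils down to a substring match against your keyword list. A minimal sketch (my own illustration, assuming a simple case-insensitive match, which may differ from the software's exact matching rules):

```python
def url_matches_keywords(url, keywords):
    """True if any keyword appears anywhere in the URL.

    Illustrative, case-insensitive version of a 'URL must contain' filter."""
    url = url.lower()
    return any(keyword.lower() in url for keyword in keywords)

targets = [
    "https://fashionweekly.example/contact",
    "https://random-blog.example/about",
]
keywords = ["fashion", "style"]
print([u for u in targets if url_matches_keywords(u, keywords)])
# → ['https://fashionweekly.example/contact']
```

Note how the second URL is dropped even though it might well be a fashion blog: this is exactly the limitation mentioned above, where niche sites without niche keywords in their URLs get filtered out.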
Under the "Submission Content" tab, you will be able to configure your contact form information. I strongly recommend that you use "Self-defined" configurations by entering custom information instead of using random data, for trust reasons. For example, if you use one name and surname in the form and quite another comes up in the email, that automatically rings alarm bells. Equally, you do not want to be writing from a random address in China. Remember, it is all about instilling trust in the recipient, so do take some time to configure your message. You can use spintax inside your message, though I generally do not bother. Spintax is very useful because some contact form platforms may flag your message as spam if the same message is being sent to many contact forms on the platform. Shopify is a good example of this.
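For anyone unfamiliar with spintax, the format nests alternatives in braces, e.g. `{Hi|Hello}`, and each submission resolves every group to one random option so no two messages are identical. A minimal resolver, as a sketch of how the syntax works (this is not GSA Website Contact's own engine):

```python
import random
import re

# A spintax group is the innermost {a|b|c} span: no nested braces inside it.
SPIN_GROUP = re.compile(r"\{([^{}]*)\}")

def spin(text, rng=random):
    """Resolve spintax by repeatedly picking one option from the
    innermost brace group until no groups remain. Handles nesting,
    e.g. {Hi|{Hello|Hey}}, by resolving inner groups first."""
    while True:
        match = SPIN_GROUP.search(text)
        if match is None:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

message = "{Hi|Hello|Hey} there, I {love|enjoy} your {blog|site}!"
print(spin(message))
```

Each call produces one of the 3 × 2 × 2 = 12 possible variants, which is what defeats duplicate-content spam filtering on platforms like Shopify.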
As discussed above, to use your own content for each field, double-click on each "Content" column field, then under the "How to fill" option select "Self-defined", click OK and enter your content.
Click on the "Captchas" tab and you will see a window where you can add your captcha services. By default, I use "GSA Captcha Breaker" and "2Captcha API with IP". Simply start GSA Captcha Breaker and it will run in the background. I pair 2Captcha API with IP with Botmaster Labs' XEvil captcha decoder, as it can solve Google image captchas, which are extremely popular with contact forms.
If you plan to run "2Captcha API with IP" (like me), you will need to add an API key and a host. For the API key, you can just add 123, and for the host, you will need to add the host IP, which appears under the settings tab in Botmaster Labs' XEvil captcha decoder. Ensure that XEvil and XRumer are always running whilst you are using GSA Website Contact.
Loading too many keywords for scraping will slow down GSA Website Contact, especially when you select a campaign in the main window before starting it. Before you start a campaign, you need to click on the actual campaign, and you will then see a message along the lines of "loading keywords". The delay is proportional to the number of keywords you are using: the fewer keywords you have, the faster the campaign loads. From experience, 30,000 keywords is a good number, since GSA Website Contact loads the campaign very quickly at that size. I am always tempted to dump as many keywords into one campaign as possible so that I do not have to create multiple campaigns for every set of keywords, but there is a better way. Create and configure one campaign without any keywords and save it. Then duplicate this template campaign and add your first set of keywords; duplicate the template again and add your next set, and so on. Overall, this approach is very effective because, beyond adding your keywords, you do not have to configure anything: the template campaign is already pre-configured.
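If you start from one master keyword list, splitting it into campaign-sized chunks is trivial to script. A small sketch of the idea, using the rough 30,000-keyword ceiling suggested above (the helper itself is hypothetical, not part of the software):

```python
def chunk_keywords(keywords, size=30_000):
    """Split a master keyword list into campaign-sized chunks.

    30,000 per chunk is the rough per-campaign ceiling before
    GSA Website Contact becomes slow to load a selected campaign."""
    return [keywords[i:i + size] for i in range(0, len(keywords), size)]

master = [f"keyword {n}" for n in range(70_000)]
chunks = chunk_keywords(master)
print(len(chunks), [len(c) for c in chunks])  # → 3 [30000, 30000, 10000]
```

Each chunk can then be saved to its own text file and pasted into one duplicate of the template campaign.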
I create many campaigns, and I like to name them according to my website and the type of keywords/sites I am targeting. For example, I may have a campaign called "WLH Pinkys Celebrity Club - Shopping Sites". This means that I am promoting Pinky's Celebrity Club Instagram Bot to shopping websites. Next, I could create a campaign called "WLH Pinkys Celebrity Club - Fashion Sites", denoting that I am promoting the software to a list of target fashion websites. Naming your campaigns properly makes it easier to track and schedule your marketing campaigns.
My favourite way to get niche-related keywords is to use Ahrefs. I input the URL/domain of a big website, such as a magazine, and then download its organic keywords. Oftentimes, big sites rank for millions of niche-related keywords, which immediately gives me a massive list of relevant keywords that I can upload straight into GSA Website Contact. Alternatively, you could use keyword-generating software to get some suggestions, but I find Ahrefs to be the quickest and most accurate method of the lot.
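An organic-keywords export usually needs a quick clean-up (deduplication, casing) before being pasted into a campaign. A minimal sketch, assuming a CSV export with a "Keyword" column (the column name and export layout are assumptions, so check them against your own export):

```python
import csv
import io

def keywords_from_export(csv_text, column="Keyword"):
    """Pull a deduplicated, lowercased keyword list from a CSV export.

    The 'Keyword' column name is an assumption about the export format;
    order of first appearance is preserved."""
    seen, result = set(), []
    for row in csv.DictReader(io.StringIO(csv_text)):
        keyword = row[column].strip().lower()
        if keyword and keyword not in seen:
            seen.add(keyword)
            result.append(keyword)
    return result

export = "Keyword,Volume\ncelebrity news,5000\nCelebrity News,4000\nfashion tips,3000\n"
print(keywords_from_export(export))  # → ['celebrity news', 'fashion tips']
```

The resulting list can then be chunked into campaign-sized batches as described in the previous section.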