Are you still looking for an affordable cloud web hosting provider with a particular emphasis on speed, scalability, and security? If so, this may be for you!

How Do I Change My URL in WordPress, and Why Is It So Important to Check for Broken Links?

You can change the Site URL for your WordPress site at any time, but there are a few important adjustments to make before the switch.

Steps to follow:

1. Change WordPress URL in the WordPress Admin Dashboard

  1. After downloading all your site files to your local PC and moving them to the new location, log into your WordPress Dashboard as an administrator.
  2. From the menu, click Settings, then General.
  3. Enter the new URL in both the WordPress Address (URL) and Site Address (URL) fields.
  4. Scroll down the page and click the Save Changes button.

Now, there is a chance that the WordPress Address and Site Address fields are not editable. If so, it’s most likely because they are hard-coded in the wp-config.php file.

If that is the case, changing the WordPress URL in the wp-config.php file is the solution.

2. Change WordPress URL in wp-config.php File

The second most common way to change your WordPress URL, as mentioned above, is via the wp-config.php file. It is important to note that values defined in wp-config.php override the settings from method one.
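For example, hard-coding the two addresses in wp-config.php looks like this. WP_HOME and WP_SITEURL are the WordPress constants for the Site Address and WordPress Address, and example.com is a placeholder for your own domain:

```php
// In wp-config.php, above the "That's all, stop editing!" line.
// Replace example.com with your actual new domain.
define( 'WP_HOME', 'https://example.com' );    // Site Address (URL)
define( 'WP_SITEURL', 'https://example.com' ); // WordPress Address (URL)
```

With these lines in place, the two fields in Settings → General become read-only, which is exactly the "not editable" situation described above.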

3. Change WordPress Site URL Via phpMyAdmin

  1. Log into the cPanel account for the domain in question.
  2. Click on the “phpMyAdmin” icon within the “Databases” section.
  3. Expand the database associated with the WordPress installation and select the “wp_options” table.
  4. Search for “siteurl” within the “option_name” column and click the “Edit” link on the same row.
  5. Replace the old URL in the “option_value” field with the new one and save; then repeat for the row where “option_name” is “home”.
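Alternatively, the same change can be made from phpMyAdmin’s SQL tab with a single query. This is a sketch that assumes the default “wp_” table prefix and uses example.com as a placeholder for your new domain:

```sql
-- Update both URL options in one statement (wp_ is the default table prefix;
-- yours may differ, e.g. wp_abc123_options on some hosts).
UPDATE wp_options
SET option_value = 'https://example.com'
WHERE option_name IN ('siteurl', 'home');
```

Double-check the prefix of your options table before running the query, since a mismatched prefix will simply update nothing.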

Now that you have completed all the necessary changes and updated everything on your site, it is time to check for broken links.

I usually use broken-links-finder from

Why are broken links so bad, and why should you fix them?

Dead hyperlinks on a website are not just annoying; they can do real damage to your online business as well as to your reputation on the Internet!

Because of them, a website owner may:

  • Lose part of the existing customer base (current users may sooner or later get frustrated enough to never come back);
  • Lose potential clients (because of the dead links, people simply won’t find the things/pages they are looking for);
  • Damage the site’s reputation (many online customers read stale hyperlinks as a sign of disrespect from the site’s owners);
  • Hurt the website’s rankings with major Search Engines like Google, Yahoo, and Bing.

As you can see, broken links are a problem for website visitors, who are left unable to reach the resource or information they wanted. These users may simply go elsewhere to find it. A site that hasn’t been updated or checked for a long time may suffer from link rot: the gradual accumulation of dead links as the pages they point to are moved or deleted.


Make sure to check your site regularly using SEO Tools like Broken Links Finder provided FREE of charge from

With the Broken Links Finder, you can scan your site and correct 404 error pages you may not even have realized you had.
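Under the hood, a checker like this simply requests each link and flags error responses. Below is a minimal Python sketch under that assumption; the URL list is a stand-in, since a real tool would first crawl your pages to collect the links:

```python
# Minimal sketch of a broken-link checker. The URL list is a placeholder;
# a real checker would crawl your site to collect links first.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError


def is_broken(status: int) -> bool:
    """Treat 4xx/5xx responses (and unreachable hosts, status 0) as broken."""
    return status == 0 or status >= 400


def check_url(url: str) -> int:
    """Return the HTTP status code for url, or 0 if the host is unreachable."""
    try:
        req = Request(url, method="HEAD")  # HEAD avoids downloading the body
        with urlopen(req, timeout=10) as resp:
            return resp.getcode()
    except HTTPError as e:
        return e.code   # e.g. 404 for a dead page
    except URLError:
        return 0        # DNS failure, refused connection, etc.


if __name__ == "__main__":
    for url in ["https://example.com/", "https://example.com/old-page/"]:
        status = check_url(url)
        if is_broken(status):
            print(f"BROKEN ({status}): {url}")
```

A dedicated tool adds crawling, retries, and reporting on top of this basic request-and-classify loop.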

The Importance of SEO Tools to Web Professionals

We all know how important SEO Tools are to Search Engine Optimization professionals. It’s virtually impossible to get any SEO task done without the help of a tool. But with the wealth of choices available on the market nowadays, how do you know which SEO Tool is best for which SEO task?

Seo Prime Tools Team had to do something about it. Since our aim is to make Search Engine Optimization (SEO) easy for everyone, we thought, why not provide the very same tools we utilize on a daily basis to check our site status and more, and make it all available in one place so that anyone can use it to improve their sites as well?

We’ve decided to compile over 60 of the best free SEO Tools to help not only Website Owners and Webmasters, but also SEO Professionals, get tasks done with efficiency and accuracy, ultimately making you more productive and successful!

Having the right SEO Tool for a specific task is crucial in a profession that’s as time-consuming and resource-intensive as Search Engine Optimization.

Can you imagine generating even a simple XML Sitemap manually?
The answer is simply “NO”, and honestly speaking, we can’t either! With SEO Tools such as the XML Sitemap Generator, Meta Tag Generator, Meta Tags Analyzer, Robots.txt Generator, and many other tools available at, website owners and SEO Professionals can accomplish these tasks in a matter of seconds; done manually, the same work could easily take 10 to 20 minutes or more.

Think about it! Take a simple free SEO Tool like the Robots.txt Generator, for example.

Did you know that the robots.txt file addresses the robots (so-called User-Agents) that search engines use to crawl web content, and can define which parts of a domain a robot may or may not crawl?

Yes, that’s right. The robots.txt file is a plain text file (no HTML) placed in your website’s root directory to tell search engines which pages to crawl and which to skip. It instructs crawlers not to visit the paths listed in the file, which helps keep bots out of private content or folders.

This instruction is useful:

  • If you want search engines to ignore any duplicate pages on your website;
  • If you don’t want search engines to index your internal search results pages;
  • If you don’t want search engines to index certain areas of your website, or the whole website;
  • If you don’t want search engines to index certain files on your website (images, PDFs, etc.);
  • If you want to tell search engines where your sitemap is located.
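For instance, the cases above map onto directives like these. The paths are placeholders, and wildcard patterns such as `*.pdf$` are an extension that not every crawler honors:

```
User-agent: *
Disallow: /?s=                 # WordPress internal search results
Disallow: /private/            # a private area (placeholder path)
Disallow: /*.pdf$              # PDF files (wildcard support varies by crawler)
Sitemap: https://example.com/sitemap.xml
```

Each `Disallow` line applies to the `User-agent` group above it, and the `Sitemap` line points crawlers at your sitemap regardless of group.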

Here you have two options: create the file manually, which takes more time (especially when you have other tasks to finish on a tight schedule), or simply use the Robots.txt Generator tool and be done in a matter of minutes or less.

How Robots.txt Works
Search engines send out tiny programs called “spiders” or “robots” to search your site and bring information back to the search engines so that the pages of your site can be indexed in the search results and found by web users.

Your robots.txt file instructs these programs not to crawl the pages of your site that you designate with a “Disallow” directive.

# robots.txt generated by
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Disallow: /images/

Article source:

About Us

We are a small team of well-trained professionals, dedicating our time to web design & development, graphics & logo design, and online marketing services for all types of businesses.