SEMrush Technical SEO Exam Answers 2018 PDF

We help you prepare for and pass your online marketing certification exams. Learn what you need with our top-quality, time-saving guides and practical tips and tricks. More info: https://www.certificationanswers.com/en/

SEMrush Technical SEO Exam Answers


What elements should text links consist of to ensure the best possible SEO performance?
Anchor text, a-tag with href-attribute
Nofollow attribute, anchor text
a-tag with href-attribute, noindex attribute
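
For reference, a minimal sketch of such a link: a plain a-tag with an href-attribute and descriptive anchor text (the URL is a placeholder).
<a href="https://www.example.com/blue-widgets/">blue widgets buying guide</a>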

What is link juice?
The number of links pointing at a certain page
The value a hyperlink passes to a particular webpage
Optimized website link hierarchy

What are the two most commonly known best practices to increase crawling effectiveness?
Multiple links to a single URL
Using linkhubs
Meta robots nofollow
Interlink relevant contents with each other
Internal, link-level rel-nofollow

Choose three statements referring to XML sitemaps that are true:
XML sitemaps must only contain URLs that give an HTTP 200 response
It is recommended to use gzip compression and UTF-8 encoding
There can be only one XML sitemap per website
XML sitemaps should usually be used when a website is very extensive
It is recommended to have URLs that return non-200 status codes within XML sitemaps
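
For reference, a minimal XML sitemap sketch (URL and date are placeholders): it should be UTF-8 encoded, may be served gzip-compressed, and should list only URLs that return HTTP 200.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page-a/</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
</urlset>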

Choose a factor that affects the crawling process negatively.
Duplicate pages/content
A well-defined hierarchy of the pages
Content freshness

Choose two statements that are false about the SEMrush Audit Tool.
It can be downloaded to your local computer
It can’t audit desktop and mobile versions of a website separately
It provides you with a list of issues with ways of fixing
It allows you to include or exclude certain parts of a website from audit

What is the proper tool to simulate Googlebot activity in Chrome?
Reverse DNS lookup
User Agent Overrider
User Agent Switcher
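
Outside of Chrome, the same kind of check can be sketched on the command line by overriding the User-Agent string with Googlebot's (the target URL is a placeholder):
curl -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" https://www.example.com/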

How often does combining a robots.txt disallow with a robots.txt noindex statement make folders or URLs appear in SERPs?
Less often than ones without noindex
Never
Occasionally
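
The combination in question is the unofficial robots.txt noindex statement placed next to a disallow, roughly like this sketch (the path is a placeholder; the noindex line was never an officially supported directive):
User-agent: *
Disallow: /internal-search/
Noindex: /internal-search/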

True or false? It is not possible to have multiple robots meta tags.
False
True
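
Several robots meta tags can coexist, for example one generic tag plus one targeted at a specific crawler, roughly:
<meta name="robots" content="noindex">
<meta name="googlebot" content="noindex, nofollow">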

Choose two correct statements about a canonical tag:
It should point to URLs that serve HTTP 200 status codes
It is useful to create canonical tag chaining
Each URL can have several rel-canonical directives
Pages linked by a canonical tag should have identical or at least very similar content
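
A canonical tag sketch (the URL is a placeholder); the target should answer with HTTP 200 and carry identical or at least very similar content:
<link rel="canonical" href="https://www.example.com/product/"/>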

Fill in the blank. It’s not wise to index search result pages because _____.
Google prefers them over other pages because they are dynamically generated and thus very fresh.
they do not pass any linkjuice to other pages
those pages are dynamic and thus can create bad UX for the searcher

True or false? PRG (Post-Redirect-Get pattern) is a great way to make Google crawl all the multiple URLs created on pages with many categories and subcategories.
False
True

Choose the wrong statement.
It is important to have all sub-pages of a category being indexed
Proper pagination is required for the overall good performance of a domain in search results
rel=next and rel=prev attributes explain to Google which page in the chain comes next or appeared before it
Pagination is extremely important in e-commerce and editorial websites
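
A sketch of the rel=prev/next annotations in the head of page 2 of a paginated series (URLs are placeholders):
<link rel="prev" href="https://www.example.com/category?page=1">
<link rel="next" href="https://www.example.com/category?page=3">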

You have two versions of the same content in HTML (on the website and in PDF). What is the best solution to bringing a user to the site with the full navigation instead of just downloading a PDF file?
Using the X-robots-tag and the noindex attribute
Introducing hreflang using X-Robots headers
Using the X-robots rel=canonical header
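
In practice this is usually implemented with a rel=canonical HTTP response header on the PDF URL, pointing at the HTML version; a sketch with a placeholder URL:
Link: <https://www.example.com/guide.html>; rel="canonical"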

What does the 4XX HTTP status code range refer to?
Server-side errors
Client-side errors
Redirects

Check all three reasons for choosing a 301 redirect over a 302 redirect:
The rankings will be fully transferred to the new URL
Link equity will be passed to the new URL
To not lose important positions without any replacement
The new URL won’t have any redirect chains
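
A 301 can be issued at the server level; a minimal Apache (mod_alias) sketch with placeholder paths:
Redirect 301 /old-page https://www.example.com/new-page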

When is it better to use the 410 error rather than the 404? Choose two answers:
When there is another page to replace the deleted URL
If the page can be restored in the near future
When the page existed and then was intentionally removed, and will never be back
When you want to delete the page from the index as quickly as possible and are sure it won’t ever be back
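
A 410 can likewise be issued at the server level; an Apache (mod_alias) sketch with a placeholder path:
Redirect gone /discontinued-product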

What is the best solution when you know the approximate time of maintenance work on your website?
Using the 503 status code with the retry-after header
Using the HTTP status code 200
Using the noindex directive in your robots.txt file
Using the 500 status code with the retry-after header
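
A sketch of the response during planned maintenance (the Retry-After value is a placeholder and can be given in seconds or as an HTTP date):
HTTP/1.1 503 Service Unavailable
Retry-After: 3600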

Choose three answers. What information can be found in an access-logfile?
The method of the request (usually GET/POST)
The request URL
The server IP/hostname
Passwords
The time spent on a URL
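
A typical access-log line in the common combined format (all values are made up), showing the client IP, timestamp, request method and URL, status code, response size and user agent:
66.249.66.1 - - [15/Mar/2018:06:25:24 +0000] "GET /category/page-2/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"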

True or false? It is recommended to work with log files constantly, making it a part of the SEO routine rather than doing one-off audits.
False
True

Which HTTP code ranges refer to crawl errors? Choose two answers.
2xx range
3xx range
5xx range
4xx range

Choose two statements that are right.
It is not a good idea to combine different data sources for deep analysis. It’s much better to concentrate on just one data source, e.g. logfile
Combining data from logfiles and webcrawls helps compare simulated and real crawler behavior
If you overlay your sitemap with your logfiles, you may see a lack of internal links that shows that the site architecture is not working properly

Choose two answers. Some disadvantages of ccTLDs are:
They have strong default geo-targeting features, e.g. .fr for France
They may be unavailable in different regions/markets
They need to be registered within the local market, which can make it expensive

Choose two HTTP response status codes that will work where there is any kind of geographical, automated redirect, i.e. for international requests coming from different geographical regions.
301 and 303
302 and 301
302 and 303

You have site versions for France and Italy and you set up two hreflangs for them. For the rest of your end-users you plan to use the English version of the site. Which directive will you use?
<link rel="alternate" href="http://example.com/" hreflang="x-default"/>
<link rel="alternate" href="http://example.com/en" hreflang="uk"/>
<link rel="alternate" href="http://example.com/en" hreflang="en-au"/>
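
For the scenario above (French and Italian versions plus an English fallback for everyone else), the full hreflang set would roughly look like this sketch (URLs are placeholders):
<link rel="alternate" href="http://example.com/fr/" hreflang="fr"/>
<link rel="alternate" href="http://example.com/it/" hreflang="it"/>
<link rel="alternate" href="http://example.com/" hreflang="x-default"/>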

True or false? The SEMrush Site Audit tool allows you only to define issues that slow down your website and does not give any recommendations on how to fix them.
True
False

Choose two approaches that are useful for performance optimization:
Avoid using new modern formats like WebP
Asynchronous requests
Increase the number of CSS files per URL
Proper compression & meta data removal for images
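
An asynchronous, non-render-blocking script request can be sketched like this (the file name is a placeholder):
<script src="analytics.js" async></script>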

True or false? Pre-fetch and pre-render are especially useful when you do not depend on 3rd party requests or contents from a CDN or a subdomain.
True
False
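
Pre-fetch and pre-render are declared as resource hints in the head, roughly like this sketch (host and URLs are placeholders):
<link rel="dns-prefetch" href="//cdn.example.com">
<link rel="prefetch" href="https://www.example.com/next-page/">
<link rel="prerender" href="https://www.example.com/next-page/">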

Fill in the blank. According to the latest statistics, 60% or more of all TOP-3 results for high-volume keyword queries have already been moved over to run on ______
HTTP
HTTPS
FTP

What are the two valid statements with regard to the critical rendering path (CRP)?
The non-critical CSS is required when the site starts to render
There is an initial view (which is critical) and below-the-fold-content
CRP on mobile is bigger than on a desktop
The “Critical” tool on GitHub helps to build CSS for CRP optimisation
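
A common CRP optimisation sketch: inline the critical above-the-fold CSS and load the remaining stylesheet in a non-blocking way (the file name is a placeholder):
<style>/* critical above-the-fold rules inlined here */</style>
<link rel="preload" href="full.css" as="style" onload="this.rel='stylesheet'">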

Choose the correct statement about mark-up
Invalid mark-up still works, so there’s no need to control it
Even if GSC says that your mark-up is not valid, Google will still consider it
Changes in HTML can break the mark-up, so monitoring is needed
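
Structured data mark-up is commonly added as JSON-LD; a minimal sketch (values are placeholders) that should be re-validated whenever the surrounding HTML templates change:
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Article", "headline": "Example headline"}
</script>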

Choose a valid statement about AMP:
Using AMP is the only way to get into the Google News carousel/box
AMP implementation is easy, there’s no need to rewrite HTML and build a new CSS
CSS files do not need to be inlined as non-blocking compared to a regular version
A regular website can never be as fast as an AMP version
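
A rough sketch of what makes a page an AMP document: the amp attribute on the html element, all custom CSS inlined in a style amp-custom block, and the AMP runtime loaded asynchronously:
<html ⚡ lang="en">
<style amp-custom>/* all custom CSS must be inlined here */</style>
<script async src="https://cdn.ampproject.org/v0.js"></script>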

Fill in the blanks. When you want to use _____, make sure they are placed in plain HTML/X-Robots tags. _____ injected by JavaScript are considered less reliable, and the chances are that Google will ignore them.
rel=amp HTML tags
hreflang tags
Canonical tags

For which type of mobile website setup should you check that the “user agent HTTP header” variable is used to identify and serve the relevant web version to the right user agent?
Responsive web design
Independent/standalone mobile site
Dynamic serving
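
With dynamic serving, the same URL returns different HTML depending on the User-Agent request header, and the response should signal this to caches and crawlers with a Vary header:
Vary: User-Agent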



