
You have spent years publishing blog posts, landing pages, and resource guides. Your content archive has grown to hundreds of URLs. And yet your organic traffic has plateaued – or worse, quietly declined – despite a consistent publishing schedule. If this situation feels familiar, the problem almost certainly is not that you have not published enough content. It is that a significant portion of what you have already published is working against you.
Thin posts that never ranked are consuming crawl budget. Old articles with outdated statistics are eroding your site’s credibility with search engines. Two separate posts targeting the same keyword are competing with each other, splitting authority and suppressing both from the first page of results. These are the kinds of problems that no amount of new content can fix – they require a content audit.
The word ‘audit’ makes many content marketers reach for their wallets. Platforms like Semrush, Clearscope, and Ahrefs offer powerful content audit features, but their price tags – often hundreds of dollars per month – put them out of reach for independent creators, small marketing teams, and growing SMBs. The good news is that a rigorous, actionable content audit does not require any of them. Using a combination of Google Search Console, Google Analytics 4, Screaming Frog’s free tier, Ahrefs Webmaster Tools, and a Google Sheet, you can build a complete picture of your content’s health and a prioritised action plan to improve it – at zero cost.
This guide walks you through the entire process: what a content audit is, why it matters for your rankings and traffic, how to collect and organise the right data using free tools, how to make the right decision for every URL on your site, and how to avoid the most common mistakes that cause audits to produce no lasting results. By the time you finish, you will have a reusable framework you can run every year – and a clear first action to take this week.
What Is a Content Audit – and Why Does It Matter?
A content audit is a systematic evaluation of every piece of published content on a website, assessed against a defined set of performance criteria. Unlike a content inventory – which is simply a raw list of all URLs and their metadata – a content audit adds a layer of judgement. Every page is reviewed not just to confirm that it exists, but to determine whether it deserves to exist, and in what form.
The evaluation typically covers three dimensions. The first is SEO performance: how much organic traffic does the page receive, where does it rank for its target keywords, and how many external sites link to it? The second is content quality: is the information accurate and up to date, is the depth appropriate for the topic’s competitiveness, and does the page fully address the intent behind the queries it targets? The third is strategic alignment: does the page serve your current business goals and audience, or is it a relic of a strategy you abandoned two years ago?
The output of a well-run content audit is not a list of problems – it is a prioritised action plan. Every URL on your site gets assigned one of five decisions: keep and promote, update and improve, consolidate with another page, redirect or remove, or leave alone for now. That action list is what transforms an audit from an academic exercise into a direct lever on your organic traffic.
Why Content Audits Have a Direct Impact on SEO
Google has been explicit, through multiple algorithm updates, that the overall quality of a site influences how its individual pages are ranked. The 2022 Helpful Content Update specifically targeted sites with high proportions of content that was created primarily for search engines rather than genuine human readers. If a meaningful percentage of your published pages are thin, outdated, or redundant, that is not just a problem for those individual pages – it is a drag on the authority of your entire domain.
There are also more specific mechanisms through which low-quality content suppresses rankings. Keyword cannibalisation – where two or more pages on the same site target the same primary search query – forces Google to choose between them, typically resulting in neither page performing as strongly as a single, authoritative piece would. Crawl budget – the finite number of URLs a search engine bot will crawl on a site within a given period – is consumed by low-value pages when it could otherwise be spent crawling and indexing your best content. Orphaned pages – those with no internal links pointing to them – are effectively invisible to search engines regardless of how well they are written.
A content audit addresses all of these issues simultaneously. It is arguably the highest-ROI content activity available to any team that has been publishing for more than 12 months, precisely because it improves existing assets rather than requiring the creation of new ones.
When Should You Run a Content Audit?
There are two types of content audit triggers: routine and reactive. Routine audits should be scheduled annually for sites publishing fewer than two posts per week, and every six months for higher-frequency publishers. Reactive audits should be triggered by specific events that suggest something structural has changed in your content’s performance.
The clearest signal for a reactive audit is a significant, sustained traffic drop – defined as a decline of 15% or more in organic traffic over a four-to-six-week period that cannot be explained by seasonality. Other strong triggers include a confirmed Google algorithm update that affected your niche, a planned site migration or domain change, and a strategic pivot that has left a large portion of your existing content misaligned with your current audience or business model.
The Free Tool Stack: What You Need Before You Start
One of the most persistent myths in SEO is that a meaningful content audit requires an enterprise-grade platform. The reality is that the data you need for 90% of content decisions is available through tools that cost nothing. Before diving into the process, it is worth understanding what each tool provides and why it belongs in your audit stack.
| Tool | Cost | What It Does in a Content Audit | Best For |
|---|---|---|---|
| Google Search Console | Free | Clicks, impressions, avg. position, CTR per URL | Traffic & ranking data – essential |
| Google Analytics 4 | Free | Page views, engagement time, bounce, conversions | User engagement & goal tracking |
| Screaming Frog SEO Spider | Free (up to 500 URLs) | Full site crawl: broken links, redirects, metadata, word count | Inventorying all URLs quickly |
| Google Sheets | Free | Central audit tracker combining all data sources | The audit master spreadsheet |
| Ahrefs Webmaster Tools | Free (own site only) | Backlinks, referring domains, top pages by authority | Link data without a paid plan |
| Google PageSpeed Insights | Free | Core Web Vitals: LCP, CLS, INP scores per URL | Technical performance checks |
Google Search Console is the indispensable foundation of any content audit. As a first-party tool provided by Google itself, it gives you the most accurate available data on how each page on your site is performing in Google search – the exact number of clicks, the number of times each URL appeared in results (impressions), the average position at which it ranked, and the click-through rate. No third-party tool, regardless of price, can match the accuracy of this data for your own site.
Google Analytics 4 complements Search Console by providing the engagement picture: what visitors do once they land on a page. Time on page, bounce behaviour, scroll depth, and conversion events tell you whether the traffic a page receives is actually finding what it is looking for – a dimension that Search Console cannot measure. Screaming Frog SEO Spider handles the technical inventory: crawling every URL on your site and surfacing broken links, redirect chains, duplicate content, missing metadata, and word count data in minutes. The free version handles sites up to 500 URLs, which covers the majority of SMB content libraries.
Ahrefs Webmaster Tools, available free for verified site owners, provides the backlink dimension that the Google tools cannot: which pages on your site have external links pointing to them, how many referring domains each page has, and which of those domains carry the most authority. This data is critical when making removal or consolidation decisions – a page with zero traffic but five referring domains from authoritative sites requires a fundamentally different decision than a page with zero traffic and zero backlinks.
The 6-Step Content Audit Process
The process below is designed to be followed sequentially. Each step builds on the last, and the data from each tool is combined into a single master spreadsheet that becomes your audit command centre. By the end of Step 5, every URL on your site will have a decision assigned to it. Step 6 converts those decisions into an actionable, prioritised plan.
Step 1: Crawl Your Site and Build the URL Inventory
Open Screaming Frog SEO Spider, enter your domain name, and run a full crawl. For most SMB sites, this takes between two and fifteen minutes depending on the size of the site and your connection speed. Once the crawl is complete, filter the results to show HTML pages only – exclude images, CSS files, JavaScript, and other non-content resources – and export the data to a CSV file.
The columns you need from this export are the URL, the page title, the meta description, the word count, the HTTP response code, the indexability status, and the canonical URL. Paste these into a new Google Sheet – this will become the foundation of your audit master tracker. Before moving on, address two things immediately: any URLs returning a 404 (page not found) error should be flagged for urgent attention regardless of anything else, and any redirect chains longer than one hop should be cleaned up as a technical hygiene priority.
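If the response code sits in its own column, a single FILTER formula surfaces the broken URLs without manual scanning. A minimal sketch, assuming URLs sit in column A and response codes in column C of the master sheet (column positions here are illustrative – adjust to your own layout):

```
=FILTER(A2:A, C2:C = 404)
```

Place the formula in any empty column and it will spill the full list of 404 URLs automatically.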
Remove non-content URLs from your working sheet at this stage. Login pages, privacy policy pages, thank-you confirmation pages, and any other functional or legal pages that are not intended to rank in search should be excluded from the content evaluation. You are building a list of pages that are competing – or should be competing – for organic search visibility. Keep only those.
Step 2: Pull Traffic and Ranking Data from Google Search Console
In Google Search Console, navigate to the Performance section and select Search Results. Set your date range to the last 12 months – this gives you a full annual view that smooths out seasonal fluctuations and provides a statistically reliable picture of each page’s performance. Make sure all four metrics are toggled on: clicks, impressions, CTR, and average position. Export the full data set to CSV.
Back in your Google Sheet, use VLOOKUP or XLOOKUP to merge the Search Console data into your Screaming Frog URL inventory, matching on the URL column. Each row in your sheet should now contain both the technical data from the crawl and the traffic data from Search Console. This is the moment your audit begins to take shape – you can now see, at a glance, which pages have strong technical health but no traffic, which pages have traffic but are technically compromised, and which pages are invisible to search entirely.
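A minimal sketch of that merge, assuming the crawl URLs sit in column A of the master tab and the Search Console export lives on a tab named GSC with URLs in column A and clicks in column B (tab names and column positions are illustrative, not prescriptive):

```
=XLOOKUP(A2, GSC!A:A, GSC!B:B, 0)
```

The fourth argument returns 0 instead of an error for URLs that Search Console has no data for – which is itself a signal worth keeping visible. Repeat the formula with the appropriate result column for impressions, CTR, and average position, then copy each one down the sheet.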
Create a Traffic Tier column using a simple formula to band your pages into three categories. Because the Search Console export covers 12 months, divide each page’s click total by 12 to get a monthly figure first. Pages receiving 500 or more organic clicks per month are Tier 1 – your highest-value assets, to be protected and built upon. Pages receiving between 50 and 499 clicks are Tier 2 – growth candidates worth optimising. Pages receiving fewer than 50 clicks are Tier 3 – the category requiring the most individual decision-making and the one most likely to contain your site’s quality debt.
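A minimal tiering formula, assuming the monthly click figure (the 12-month total divided by 12) sits in column E:

```
=IFS(E2 >= 500, "Tier 1", E2 >= 50, "Tier 2", TRUE, "Tier 3")
```

The final TRUE condition acts as the catch-all for everything under 50 clicks.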
Step 3: Add Engagement Data from Google Analytics 4
In GA4, navigate to Reports, then Engagement, then Pages and Screens. Set the same 12-month date range as your Search Console export to ensure the data is comparable. The metrics you need are page views, average engagement time, and conversions or goal completions if these are configured for your site. Export this data and merge it into your audit sheet alongside the Search Console columns.
The engagement data adds a dimension that traffic numbers alone cannot reveal. A page that receives 2,000 organic clicks per month but has an average engagement time of 18 seconds is attracting visitors who are leaving immediately – either because the content failed to deliver on its title’s promise, or because the search intent behind the query is not what the page was designed to serve. Conversely, a page with modest traffic but an average engagement time of four minutes and a meaningful conversion rate is a high-value asset that deserves far more promotion than it is currently receiving.
Two patterns in this data deserve immediate attention. The first is high traffic combined with very low engagement – this almost always indicates an intent mismatch that needs to be resolved either by rewriting the content to serve the actual query intent or by targeting a different keyword. The second is high engagement combined with low traffic – this indicates strong content that is poorly optimised or poorly promoted, and represents one of the easiest wins available in any content audit.
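Both patterns can be pulled out of the merged sheet with a FILTER once the GA4 columns are in place. A sketch of the first pattern, assuming monthly clicks in column E and average engagement time in seconds in column H – the 500-click and 30-second thresholds are illustrative starting points, not fixed rules:

```
=FILTER(A2:A, E2:E >= 500, H2:H < 30)
```

Inverting the comparisons (low clicks, high engagement time) produces the second list – the under-promoted pages.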
Step 4: Check the Backlink Profile in Ahrefs Webmaster Tools
Log into Ahrefs Webmaster Tools and navigate to Site Explorer for your domain. Under the Best by links report, you will find a ranked list of your URLs ordered by the number of referring domains – external websites that link to each page. Export the top 100 to 150 pages by referring domains and merge this data into your audit sheet.
The backlink data changes the calculus for many content decisions in ways that traffic data alone would miss. A page with zero organic traffic over the past 12 months looks like a clear removal candidate – until you discover it has 23 referring domains from relevant, authoritative websites. Removing that page without first redirecting it to an appropriate replacement would destroy link equity that took years to accumulate. The rule of thumb is simple: any page with one or more quality backlinks should be redirected rather than simply deleted, regardless of its traffic performance.
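That rule of thumb is easy to encode as a draft-decision column. A sketch, assuming monthly clicks in column E and referring domains in column I – the labels are suggestions for a first pass, not final verdicts:

```
=IF(I2 > 0, "Redirect – has backlinks", IF(E2 = 0, "Removal candidate", "Review"))
```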
The backlink audit also surfaces an often-overlooked opportunity: pages that have been deleted or are currently returning 404 errors but still have live backlinks pointing to them. These represent dead link equity – authority that is being sent to your site but landing nowhere. Each one should be identified and redirected to the most relevant live page, immediately recovering that link value.
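These dead-equity pages can be isolated with one more FILTER, assuming response codes in column C and referring domains in column I of the merged sheet:

```
=FILTER(A2:A, C2:C = 404, I2:I > 0)
```

Every URL this returns should receive a 301 redirect to its closest live equivalent.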
Step 5: Categorise Every URL Using the Decision Framework
With all four data streams now merged into your audit sheet, you have everything you need to make an informed decision about every URL on your site. The framework below reduces every possible situation to one of five categories. Work through your Tier 3 pages first – they have the least traffic to lose and represent the highest concentration of low-quality content. Then move to Tier 2, identifying update priorities. Tier 1 pages should be reviewed last, with the goal of protecting and amplifying their performance rather than making structural changes.
| Content Status | Criteria | Action | Priority |
|---|---|---|---|
| Keep & Promote | Top traffic, strong rankings, on-brand, evergreen | Add internal links; refresh CTAs; update date if edited | High – do this first |
| Update & Improve | Page 2–3 rankings, outdated data, thin or incomplete | Expand depth, refresh stats, re-optimise title & meta | High – biggest ROI |
| Consolidate | 2+ posts on same topic, keyword cannibalisation | Merge into one pillar post; 301-redirect all merged URLs | Medium |
| Redirect / Remove | Zero traffic, no backlinks, irrelevant or duplicate | 301 to most relevant URL; 410 if truly no value or links | Medium – clean up |
| Leave Alone | Low traffic but niche value; no update needed now | Monitor quarterly; revisit at next audit cycle | Low |
As you work through the categorisation, document not just the decision but the reasoning. Six months from now, when you or a colleague revisits this audit sheet, the rationale column is what separates a useful historical record from a list of unexplained actions. The few extra seconds it takes to note ‘no traffic, no backlinks, duplicate of /blog/email-marketing-tips, redirecting to that URL’ will pay significant dividends the next time the audit is updated.
Step 6: Build the Prioritised Action Plan
The final step converts your categorised spreadsheet into an actionable 90-day sprint. Sort your sheet by the combination of priority level and potential traffic impact – pages ranked in positions 6 through 20 with high impression counts are your most immediately actionable opportunities, because they are already visible to search engines and need only marginal improvement to break onto page one.
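That striking-distance set is one FILTER away, assuming average position in column G and impressions in column F, with the 1,000-impression threshold as an illustrative cut-off:

```
=FILTER(A2:A, G2:G >= 6, G2:G <= 20, F2:F > 1000)
```

The resulting list is the shortlist your 90-day sprint should be built from.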
Identify your top 10 to 15 update priorities for the first 90 days. For each one, document the specific action required: which sections need expanding, which statistics need updating, which internal links need adding, and what the revised target keyword and meta title should be. Assign ownership for each task, set a realistic deadline, and define what ‘done’ looks like – because ‘update this post’ is not a task, it is a wish. ‘Expand the section on email automation tools to 400 words, update all statistics to 2024 sources, and add three internal links from related posts’ is a task.
For consolidation and removal tasks, add one additional step: verify the redirect plan before executing any changes. For every URL you are removing or redirecting, confirm the destination URL in your sheet. For every URL you are consolidating, confirm that the surviving page is either already strong enough to absorb the merged content or will be rewritten to that standard before the redirects go live. Redirecting traffic and link equity to a weak page is not an improvement – it is just a relocation of the problem.
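The destination check itself can be automated before any redirect goes live. A sketch, assuming the planned destination URL sits in column K of the action plan and the full list of live URLs is in column A of a tab named Crawl:

```
=IF(ISNUMBER(MATCH(K2, Crawl!A:A, 0)), "Destination live", "Destination missing – fix before redirecting")
```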
Content Decisions in Detail
The five-category decision framework works best when each category has clear, consistent criteria applied to it. The following section explains exactly what each decision means in practice, what actions it requires, and what common mistakes to avoid when executing it.
Keep and Promote
Pages in this category are your site’s best-performing assets – consistent top-10 rankings, meaningful monthly traffic, strong engagement data, and relevance to your current strategic priorities. The temptation with these pages is to leave them entirely alone, on the assumption that if it is not broken, it does not need fixing. This is a mistake. Your best-performing pages deserve active investment, not benign neglect.
The most valuable action for a high-performing page is improving its internal link profile. Identify five to ten newer pages on your site that cover related topics and add a contextual link from each of them back to this high-authority page. This concentrates link equity where it is most likely to produce compounding returns. Beyond internal linking, review the page’s calls to action – if the page is ranking for a commercially relevant query and driving significant traffic, an outdated or absent CTA is leaving conversion on the table.
Update and Improve
This category typically offers the highest return on effort in any content audit. These are pages that are ranking – usually on pages two and three of search results – but are not yet performing at their potential. The gap between their current ranking and the first page is most often a quality gap: the page covers the topic adequately but not comprehensively, contains outdated information, or lacks the depth and structure that the current top-ranking pages offer.
The most effective update process starts with a gap analysis rather than a word count target. Open the top three pages currently ranking for your target keyword and identify every subheading, question, or subtopic they cover that your page does not. That gap list is your update brief. Add only what is genuinely missing – padding a 600-word post to 2,000 words with filler content does not improve its ranking; it dilutes it. Alongside the content gaps, update every statistic drawn from a source more than 18 months old, refresh any tool or platform mentions that are no longer accurate, and revise the meta title and description to reflect current search intent more precisely.
Consolidate
Consolidation is the appropriate action when two or more pages on your site cover the same topic at a level of depth that is insufficient to justify separate existence. The most common symptom is keyword cannibalisation: in Google Search Console, you will see multiple URLs sharing impressions for the same primary query, with neither page consistently ranking above the other. The solution is to choose the stronger of the two pages as the survivor, rewrite it as a comprehensive, definitive piece on the topic incorporating the best elements of all the pages being merged, and issue 301 permanent redirects from all merged URLs to the survivor.
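Cannibalisation candidates can be flagged with a COUNTIF helper column. A sketch, assuming a tab of query-to-URL pairs with the query text in column A – note that the default Search Console UI export lists queries and pages separately, so building this pairing requires the Search Console API or per-page query exports:

```
=COUNTIF(A:A, A2)
```

Placed in a helper column on that tab and copied down, any query showing a count of 2 or more has multiple URLs competing for it.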
The most important rule in consolidation is to check backlinks before deciding which URL survives. If the page you were planning to retire has significantly more referring domains than the one you planned to keep, it should become the survivor – or at minimum, the redirect should flow from the weaker page to the stronger one. Backlink equity is not perfectly preserved through redirects, but it is substantially preserved, and it is always better than the alternative.
Redirect and Remove
Removal is appropriate when a page has received no meaningful organic traffic in the past 12 months, has no backlinks from relevant external sites, serves no user intent that your other content does not already cover, and is not part of any active promotional or paid traffic strategy. Every page that meets all four of these criteria represents a small but real drag on your site’s overall quality signal – and removing it, via a 301 redirect where a relevant destination exists or an HTTP 410 response where none does, is a legitimate SEO improvement.
The mistake most teams make with removal is being either too aggressive or too timid. Too aggressive means deleting pages with even one or two quality backlinks without redirecting them – an easy way to permanently lose link equity. Too timid means keeping every page because removing content feels risky, which leaves the quality problem unsolved. Use the four-criteria check above as your standard: if a page fails all four tests, remove it. If it passes even one – particularly the backlink test – redirect it rather than deleting it outright.
Common Mistakes That Derail Content Audits
Content audits fail more often from strategic and process errors than from technical ones. Understanding the most common failure modes before you start is the simplest way to avoid them.
Auditing Without a Defined Goal
‘Improving content quality’ is not a goal. It is a sentiment. Before running a single Screaming Frog crawl, define what success looks like in measurable terms: increasing organic traffic to the blog by 20% within six months, moving five target pages from page two to page one, or reducing the number of indexed thin-content pages by 40%. A defined goal determines which data you prioritise, which decisions you make first, and how you evaluate whether the audit produced a return.
Using Only One Data Source
Some teams run their entire content audit from Search Console alone, making removal decisions based purely on traffic data without checking backlinks. Others use only a crawl tool and a word count threshold, removing anything under 500 words regardless of its link profile or engagement data. Both approaches produce decisions that look logical in isolation but frequently cause real damage. The value of the four-source approach described in this guide – crawl data, traffic data, engagement data, and backlink data – is that it reveals contradictions that a single data source will always miss.
Removing Content Too Aggressively
The pruning instinct that drives most content audits can easily overshoot. A page that receives 30 organic clicks per month looks negligible until you realise it is the only page on your site covering a specific long-tail query that your best customer segment searches for. Context matters enormously in content decisions, and the four-criteria test for removal exists precisely to prevent well-intentioned pruning from creating gaps in your topical coverage that take months to rebuild.
Other Critical Mistakes
- Auditing the entire site in one session rather than scoping to a single subdirectory or content category – the scope paralysis this creates is the primary reason first audits are never finished
- Failing to document the rationale for each decision – six months later, no one remembers why a redirect was set up or which page was the survivor in a consolidation, making the next audit significantly harder
- Skipping the post-audit monitoring phase – executing changes is only half the work; verifying that those changes produced the expected results is what makes the next audit faster and more confident
- Treating the audit as a one-time project rather than a recurring programme – content ages, search intent shifts, and new pages create new cannibalisation risks; build at least an annual review cycle into your content calendar from the start
Conclusion
A site with one hundred excellent pages will almost always outperform a site with five hundred mediocre ones. This is the central insight behind content auditing, and it is why the discipline has moved from an optional best practice to a core SEO activity for any team that has been publishing for more than a year.
The framework in this guide – crawl with Screaming Frog, pull traffic data from Search Console, add engagement data from GA4, check backlinks with Ahrefs Webmaster Tools, categorise every URL, and build a prioritised action plan – costs nothing to implement and produces results that rival those of teams spending hundreds of dollars per month on dedicated audit platforms. The tools are free. The process is repeatable. The only input required is structured time and a willingness to make decisions about content that may have taken significant effort to create.
Start small if the scale of the task feels daunting. Take your twenty lowest-traffic posts, run them through the process described here, and make one decision about each one. That is your first content audit. Schedule the next one for six months from now, and the one after that for six months beyond that. Each cycle will take less time than the last, because you will be maintaining a clean baseline rather than clearing years of accumulated quality debt.
The content you improve today will compound over the months and years ahead. Every page that moves from page two to page one, every thin post that becomes a comprehensive resource, every cannibalisation conflict that gets resolved – each of these is a compounding asset that earns traffic and authority without requiring a single new word to be written. That is the return on investment that makes a content audit the most underrated activity in content marketing.
Frequently Asked Questions
Q1. How long does a content audit take?
The time required depends almost entirely on the size of your site and whether this is your first audit or a repeat cycle. For a site with 50 to 100 URLs, using the free tool stack described in this guide, expect to spend four to eight hours spread across two working days – roughly two hours on data collection and merging, and the remainder on decision-making and action planning. For a site with 200 to 500 URLs, one to two working weeks is a realistic estimate. A site being audited for the second or third time takes a fraction of this time, because the previous audit sheet provides a baseline that dramatically reduces the categorisation work. The slowest part of any content audit is never the data collection – it is making thoughtful, consistent decisions about every URL, which cannot be rushed without sacrificing the quality of the resulting action plan.
Q2. What is the difference between a content audit and a content inventory?
A content inventory is the raw starting point: a complete list of all URLs on a site, typically including the page title, word count, publication date, and HTTP response code. It tells you what exists. A content audit adds a layer of evaluation and decision-making on top of that inventory. Each URL is assessed against traffic, engagement, and backlink data, and assigned a specific action – keep, update, consolidate, redirect, or remove. The inventory answers the question ‘what do we have?’ The audit answers the question ‘what should we do about it?’ Most content audit guides use the two terms interchangeably, but understanding the distinction helps you sequence the work correctly: the inventory always comes first, followed by the data-enrichment and decision-making phases.
Q3. Should I delete low-traffic content?
Low traffic alone is not a sufficient reason to delete a page, and treating it as one is the most common error in content auditing. Before making any removal decision, check four criteria: Does the page have any backlinks from relevant external sites? Does it rank for any keywords, even low-volume ones? Does it serve a user intent that no other page on your site covers? Does it contribute to your topical authority in a way that supports adjacent, higher-traffic pages? If the honest answer to all four questions is no, removal or consolidation is appropriate. If any answer is yes, update or redirect the page rather than deleting it. The risk of over-pruning – removing content that contributes to domain authority or topical coverage in ways that traffic data does not directly reflect – is real and can take months to reverse.
Q4. How often should I run a content audit?
The right cadence depends on how actively you are publishing. Sites that publish one to two pieces of content per month accumulate quality debt slowly, and an annual audit is sufficient to keep the archive in good shape. Sites publishing weekly or more frequently should run a lighter-touch review every six months alongside a full annual audit. Beyond these routine cycles, certain events should trigger an unscheduled audit regardless of timing: a sustained organic traffic drop of 15% or more, a confirmed Google algorithm update that affected sites in your niche, a planned site migration or domain change, or a significant pivot in your content strategy that has left a large portion of your existing archive misaligned with your current audience. The most important principle is consistency – an audit run annually on a predictable schedule is far more valuable than a perfect audit run once and never repeated.
Q5. Can I run a content audit without Google Search Console data?
Technically yes, but the quality of your decisions will be substantially reduced. Google Search Console provides first-party data – information that comes directly from Google – on how each individual page on your site is performing in search. No third-party tool, regardless of its cost, can match the accuracy of this data for your own domain. If Search Console has not previously been configured for your site, set it up immediately, verify your domain, and wait at least 90 days before running the audit. In the interim, you can still conduct a useful technical inventory using Screaming Frog and add engagement data from GA4, but decisions about which pages to update or remove should be deferred until you have reliable search performance data to inform them. Acting on incomplete data is worse than waiting.
Q6. What does keyword cannibalisation mean in a content audit?
Keyword cannibalisation occurs when two or more pages on the same website target the same primary search query, causing them to compete against each other for the same ranking position. Google must decide which of the competing pages best serves the query, and in most cases it cannot rank both – meaning neither page performs as strongly as a single, authoritative piece covering the topic comprehensively would. In a content audit, cannibalisation is identified by filtering Google Search Console data to find multiple URLs with significant impressions for the same query. The fix is consolidation: the stronger of the competing pages is designated the survivor, rewritten to incorporate the best content from all competing pages, and all other competing URLs are 301-redirected to it. Cannibalisation is one of the most common causes of inexplicably flat rankings on competitive topics, and resolving it is often one of the fastest wins available in any audit.
Q7. Is Screaming Frog really free, and what are its limits?
Screaming Frog SEO Spider is genuinely free for sites with up to 500 URLs. The free version provides full access to all core crawl features – URL discovery, response code identification, meta data extraction, word count, redirect chain analysis, duplicate content detection, and more. For the majority of SMB content libraries, 500 URLs is sufficient. Sites above this threshold require a paid annual licence, currently priced at approximately 259 dollars per year, which removes the crawl limit entirely and adds a range of additional features including scheduled crawls and Google Analytics integration. Teams who want to stay within the free tier on a larger site can work around the limit by crawling one subdirectory at a time – for example, crawling only the /blog/ subfolder in one session and the /resources/ subfolder in another. This approach requires a little more manual data management but costs nothing and produces the same output.