Nuxt Content Migration: Boost Performance & Simplify Structure

by Felix Dubois

Hey guys! We've got a big plan to migrate our current setup to a cleaner, more efficient Nuxt Content structure. This isn't just about tidying up; it's about making our site faster, easier to maintain, and overall, a better experience for everyone. So, let's dive into the details!

Overview

Our main goal here is to move away from our complex nested folder structure to a more streamlined, Nuxt Content-friendly setup. This will help us ditch those pesky build issues and seriously boost performance. Think of it as going from a cluttered attic to a well-organized, minimalist space. Our current structure is causing headaches, and we're aiming for a target structure that's much easier to work with.

Current vs Target Structure

Current (Problematic):

Our current setup is a bit of a maze. Imagine folders within folders, making it tough to find what you need and slowing down our build times. It's like trying to navigate a city without street signs – frustrating and inefficient.
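
To make that concrete, here's an illustrative sketch of the current layout (folder and file names are examples, based on the paths referenced in the steps below):

```
content/
  libraries/
    some-library/              # one deeply nested folder per library
      index.md                 # library page content + frontmatter
      logbook/
        2024-01-05-visit.md    # logbook entries buried under each library
      photos/
        Front Door.jpg         # images stored alongside markdown content
```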

Target (Nuxt-Friendly):

Our target structure is all about simplicity. We're talking clean, flat hierarchies that Nuxt Content loves. This means faster builds, easier content management, and a much happier development team. It's like moving to a well-planned neighborhood where everything is just a short walk away.
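
Again as an illustrative sketch, the same content in the target layout:

```
content/
  libraries/
    some-library.md            # one flat markdown file per library
  logbooks/
    some-library/
      2024-01-05-visit.md      # entries grouped by library_id
public/
  images/
    libraries/
      some-library/
        front-door.jpg         # images served straight from /images/...
```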

Step 1: Create Migration Scripts

Issue: Creating PowerShell Migration Scripts

Alright, first things first, we need to create some trusty PowerShell scripts to handle the heavy lifting. These scripts will automate the process of moving and reorganizing our content, so we don't have to do it all by hand. Think of them as our little helpers, ensuring everything gets moved correctly and efficiently.

Acceptance Criteria:

  • We need a scripts/migrate-libraries.ps1 script to flatten the library structure. This script will take our currently nested library folders and bring them all to the same level, making it easier to access and manage each library's content. The goal is to transform a deeply nested structure into a flat list, kind of like alphabetizing a messy bookshelf.
  • Next up, scripts/migrate-logbooks.ps1 will reorganize our logbook entries. We're talking about moving entries from their current scattered locations into a more organized system. This script will ensure our logbooks are easily searchable and logically structured. Imagine taking a pile of old journals and organizing them chronologically and by subject.
  • We also need scripts/migrate-images.ps1 to move images to the public folder. This is crucial for performance, as it allows Nuxt to serve images more efficiently. Think of it as moving your photos from a filing cabinet to a display case where they’re easily accessible and look great. Serving images as static assets from the public folder is key for a fast-loading site.
  • Finally, scripts/validate-migration.ps1 will verify our data integrity. This script is our safety net, ensuring that nothing gets lost or corrupted during the migration. It’s like having a checklist to make sure all the boxes are ticked. We need to ensure every piece of content is correctly migrated.
  • All scripts should work seamlessly on Windows PowerShell, our primary environment for these tasks. This ensures everyone on the team can run the scripts without compatibility issues. It's about making the process smooth and accessible for everyone involved.
  • Each script will include a dry-run mode for testing. This lets us see what the script would do without actually making any changes. It's like a practice run before the big performance. We can identify and fix any potential issues before they become real problems.

Technical Details:

We'll use PowerShell because it's powerful and widely supported in our environment. The dry-run mode will be implemented using conditional logic within the scripts, allowing us to simulate the migration without altering any files. Each script will be designed to handle errors gracefully, providing informative messages if something goes wrong. Error handling is crucial for a smooth and reliable migration.
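
As a minimal sketch of what the dry-run switch could look like (parameter names and paths here are placeholders, not the final scripts):

```powershell
# migrate-libraries.ps1 -- dry-run skeleton (illustrative only)
param(
    [bool]$dryRun = $true   # default to dry-run so nothing moves by accident
)

$libraries = Get-ChildItem -Path "content/libraries" -Directory

foreach ($lib in $libraries) {
    $source = Join-Path $lib.FullName "index.md"
    $target = "content/libraries/$($lib.Name).md"

    if ($dryRun) {
        Write-Host "[DRY RUN] Would move $source -> $target"
    }
    else {
        try {
            Move-Item -Path $source -Destination $target -ErrorAction Stop
            Write-Host "Moved $source -> $target"
        }
        catch {
            Write-Error "Failed to move ${source}: $_"
        }
    }
}
```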

Step 2: Define New Content Schema

Issue: Defining Standardized Frontmatter Schema for Libraries

Now, let's talk about structure. We need a solid, standardized frontmatter schema for our libraries. This is like creating a blueprint for our content, ensuring consistency and making it easier to work with. A well-defined schema helps us manage our data effectively and prevents headaches down the road. This is all about laying a solid foundation.

Acceptance Criteria:

  • We'll create a TypeScript interface for our library frontmatter. TypeScript will help us enforce the schema, catching errors early and ensuring our data is consistent. It’s like having a grammar checker for our content. The interface will define the structure of our frontmatter.
  • This interface will clearly define required and optional fields. Knowing what's mandatory and what's optional helps us ensure we have all the necessary information for each library. It's like having a checklist for each entry. We need to specify exactly what information each library entry should contain.
  • We'll also establish naming conventions for IDs and slugs. Consistent naming is crucial for organization and avoids confusion. Think of it as giving each library a unique and easily recognizable name. Clear naming conventions make everything easier to find and reference.
  • Photo path conventions will be documented to ensure consistency in how we reference images. This helps us avoid broken links and ensures our images are displayed correctly. It's like creating a standard way to label and store our photos. Consistency in image paths is essential for a polished website.
  • A validation schema will be created to automatically check that our frontmatter conforms to the defined interface. This is an extra layer of protection against errors, ensuring our data is always valid. It's like having a quality control inspector. Automated validation helps maintain data integrity.

Schema Definition:

The schema will include fields like id, title, slug, description, tags, location, photo, and more. We'll use TypeScript's type system to define the data types for each field (e.g., string, number, array). The validation schema will likely use a library like Yup or Zod to provide runtime validation. The goal is a robust and well-defined content structure.
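
As a sketch of what that could look like, assuming Zod for the runtime validation (the exact fields and types beyond those listed above are illustrative):

```ts
// types/library.ts -- illustrative sketch of the frontmatter contract
import { z } from 'zod'

export const librarySchema = z.object({
  id: z.string(),                         // stable unique identifier
  title: z.string(),
  slug: z.string(),                       // lowercase, hyphen-separated
  description: z.string().optional(),
  tags: z.array(z.string()).default([]),
  location: z
    .object({ lat: z.number(), lng: z.number() })
    .optional(),
  photo: z.string().optional(),           // path under /images/libraries/<id>/
})

// The TypeScript interface is derived from the same schema, so the two never drift
export type LibraryFrontmatter = z.infer<typeof librarySchema>
```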

Step 3: Execute Content Migration

Issue: Migrating Library Content Files to New Structure

Time to put our scripts to work! We're going to migrate our library content files to the new structure. This is where the magic happens, transforming our messy content into a beautifully organized system. Think of it as moving from a chaotic storage room to a well-organized library.

Acceptance Criteria:

  • We'll run the migration script on all library folders. This ensures that every piece of content is moved and reorganized according to our new structure. It’s like sweeping through every corner of the room and putting everything in its place. The script needs to process all our library content.
  • index.md files will be converted to individual library files. This simplifies our structure and makes it easier to manage individual libraries. It’s like turning a single, bulky book into a series of easily digestible chapters. Each library will have its own dedicated file.
  • Frontmatter data will be extracted and standardized. This is crucial for consistency and ensures our data is in the correct format. It’s like taking notes from various sources and organizing them into a single, coherent document. Standardized frontmatter makes our content more manageable.
  • We'll preserve all content and markdown formatting. We want to make sure nothing gets lost in translation. It's like carefully packing your belongings so they arrive in perfect condition. Preserving formatting is essential for maintaining content quality.
  • A backup of the original structure will be generated. This is our safety net, allowing us to revert to the old structure if anything goes wrong. It’s like having a safety net during a high-wire act. Backups are crucial for data safety.
  • We'll validate that all files have been migrated successfully. This is our final check to ensure everything has been moved correctly. It’s like double-checking your luggage before leaving for the airport. Validation ensures a successful migration.

Commands:

The migration script will likely be run from the command line with a PowerShell command like .\scripts\migrate-libraries.ps1 -dryRun:$false. We'll use verbose logging to track the progress of the migration and identify any errors. The script will also generate a report summarizing the migration results. Detailed logging is essential for debugging.
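
For example (the flag follows the dry-run sketch from Step 1 and may differ in the final scripts):

```powershell
# Preview the changes first, without touching any files
.\scripts\migrate-libraries.ps1 -dryRun:$true

# Run the real migration once the dry-run output looks right
.\scripts\migrate-libraries.ps1 -dryRun:$false
```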

Step 4: Migrate Logbook Entries

Issue: Reorganizing Logbook Entries by Library ID

Next up, we're tackling logbook entries. We need to reorganize them so they're easily accessible and linked to the correct library. This is like sorting through a pile of receipts and filing them by expense category. Organized logbooks make it easier to track activity and history.

Acceptance Criteria:

  • We'll move logbook entries from content/libraries/*/logbook/ to content/logbooks/*/. This centralizes our logbook entries, making them easier to find and manage. It's like moving all the files from scattered folders into a single, well-organized directory. Centralization improves accessibility.
  • We'll maintain chronological organization within each logbook. This ensures that entries are displayed in the correct order, making it easier to follow the history of each library. It’s like arranging journal entries in date order. Chronological order helps with understanding the timeline.
  • Logbook frontmatter will be standardized to ensure consistency. This is similar to what we did with the library frontmatter, creating a consistent structure for our data. Standardized frontmatter makes our logbook entries more uniform and manageable.
  • We need to ensure that library_id references are correct. This is crucial for linking logbook entries to the correct library. It’s like making sure the labels on your files are accurate. Correct references are essential for data integrity.
  • Finally, we'll validate that all logbook entries have been migrated successfully. This is our final check to ensure no entries have been lost or corrupted. It’s like counting all the receipts to make sure none are missing. Validation ensures completeness.

Target Structure:

The target structure will be content/logbooks/[library_id]/[logbook_entry_file].md. This per-library folder structure makes it easy to query and retrieve logbook entries for a specific library. It's like having a dedicated folder for each project. A clear structure simplifies data access.
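
With that layout, pulling a library's entries in date order becomes a single query. A sketch, assuming Nuxt Content's queryContent API and a date field in each entry's frontmatter:

```ts
// Inside a page or component: fetch one library's logbook, newest first (illustrative)
const route = useRoute()
const libraryId = route.params.id as string   // hypothetical route parameter

const { data: entries } = await useAsyncData(`logbook-${libraryId}`, () =>
  queryContent(`/logbooks/${libraryId}`)
    .sort({ date: -1 })   // assumes a `date` field in each entry's frontmatter
    .find()
)
```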

Step 5: Migrate Images to Public Folder

Issue: Moving Library Images to Public/images Structure

Let's talk images! We're moving our library images to the public folder to improve performance and simplify access. This is like moving your photos from a private album to a public gallery. Public assets are served more efficiently.

Acceptance Criteria:

  • We'll move all images from content/libraries/*/photos/ to public/images/libraries/*/. This ensures that images are served directly by the web server, improving load times. It’s like having a direct delivery line for your images. Public folder access is faster.
  • We'll maintain folder organization by library ID within the public folder. This keeps our images organized and makes it easier to manage them. It’s like having a separate folder for each photo album. Organized images are easier to manage.
  • All image references in content files will be updated to reflect the new paths. This ensures that images are displayed correctly after the migration. It’s like updating the address book when someone moves. Correct references are crucial for display.
  • We'll optimize image file names by removing spaces and using lowercase. This improves compatibility and avoids potential issues with web servers. It’s like giving your files a clean and professional label. Optimized filenames improve compatibility.
  • We'll validate that all images are accessible via HTTP. This ensures that our images are being served correctly and can be displayed on the website. It’s like making sure your photos are visible in the gallery. Accessibility is key.

Path Mapping:

For example, an image located at content/libraries/123/photos/image.jpg will be moved to public/images/libraries/123/image.jpg. We'll update the image references in the markdown files accordingly. Consistent path mapping simplifies maintenance.
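
A rough sketch of the copy-and-rename step for a single library (the normalization rules and paths are illustrative):

```powershell
# Copy one library's photos into public/images and normalize the file names
$libraryId = "123"                                   # illustrative id
$sourceDir = "content/libraries/$libraryId/photos"
$targetDir = "public/images/libraries/$libraryId"

New-Item -ItemType Directory -Path $targetDir -Force | Out-Null

Get-ChildItem -Path $sourceDir -File | ForEach-Object {
    # Lowercase and replace spaces, e.g. "Front Door.JPG" -> "front-door.jpg"
    $newName = $_.Name.ToLower() -replace ' ', '-'
    # Copy (not move) so the originals survive until validation passes
    Copy-Item -Path $_.FullName -Destination (Join-Path $targetDir $newName)
}
```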

Step 6: Update Vue Components

Issue: Refactoring Components for New Content Structure

Now, let's dive into our Vue components. We need to refactor them to work with our new content structure. This is like renovating your house to match the new layout of your furniture. Updated components ensure compatibility.

Acceptance Criteria:

  • We'll update the [[slug]].vue component to handle the new structure. This is our main component for displaying library pages, so it's crucial that it works correctly. It’s like the front door of our library. The slug component is key.
  • Data loading will be simplified using queryContent('/libraries').where({id: libraryId}). This new query pattern makes it easier to fetch library data. It’s like having a streamlined search function. Simplified queries improve efficiency.
  • Logbook loading will be updated to use queryContent('/logbooks/' + libraryId). This ensures that logbook entries are loaded correctly for each library. It’s like having a dedicated section for each journal. Correct logbook loading is essential.
  • We'll fix image path resolution for the public folder. This ensures that images are displayed correctly in our components. It’s like making sure the picture frames are hung in the right places. Accurate image paths are crucial.
  • Complex folder-based queries will be removed. This simplifies our components and improves performance. It’s like decluttering your living space. Simpler queries improve performance.
  • We'll test that all library pages load correctly. This is our final check to ensure that our components are working as expected. It’s like taking a tour of your renovated house. Thorough testing ensures functionality.

New Query Pattern:

Instead of relying on complex path parsing, we'll use Nuxt Content's queryContent API to fetch data based on IDs and slugs. This makes our components more robust and easier to maintain. A clear query pattern is key.
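
As a sketch of how the library page could load its data under the new structure (this assumes the route's slug doubles as the library id; adjust to the final schema):

```ts
// Inside <script setup lang="ts"> of the [[slug]].vue page (illustrative)
const route = useRoute()
const libraryId = route.params.slug as string

// One flat query instead of walking nested folders
const { data: library } = await useAsyncData(`library-${libraryId}`, () =>
  queryContent('/libraries').where({ id: libraryId }).findOne()
)

// Logbook entries now live under /logbooks/<library_id>
const { data: logbook } = await useAsyncData(`logbook-${libraryId}`, () =>
  queryContent('/logbooks/' + libraryId).find()
)
```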

Step 7: Update Composables

Issue: Simplifying useLibraries Composable for New Structure

Let's simplify our composables, specifically useLibraries.ts. This will make our code cleaner and easier to understand. Think of it as organizing your toolbox so you can find the right tool quickly.

Acceptance Criteria:

  • We'll update useLibraries.ts to use the flat library structure (a sketch of the simplified composable follows this list). This ensures that our composable is working with the new data organization. It's like updating your map to reflect the new roads.
  • Complex path parsing logic will be removed. This simplifies our composable and makes it easier to maintain. It’s like taking a shortcut instead of a winding road. Simpler logic improves maintainability.
  • Search and filtering will be simplified. This makes it easier to find libraries based on different criteria. It’s like having a more efficient search engine. Simplified search improves usability.
  • Image resolution helpers will be updated to work with the new image paths. This ensures that images are displayed correctly in our components. It’s like calibrating your camera for the new lighting. Accurate image resolution is essential.
  • We'll ensure backwards compatibility during the transition. This allows us to migrate our components gradually without breaking existing functionality. It’s like having a temporary bridge during construction. Backwards compatibility ensures a smooth transition.
  • Error handling will be added for missing data. This makes our composable more robust and prevents crashes. It’s like having a safety net for your code. Robust error handling is crucial.
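
A minimal sketch of the simplified composable; the function names and the 404 handling are assumptions, not the final API:

```ts
// composables/useLibraries.ts -- illustrative sketch, not the final implementation
export const useLibraries = () => {
  // All libraries now live as flat documents under /libraries
  const fetchAll = () => queryContent('/libraries').find()

  const fetchById = async (id: string) => {
    const results = await queryContent('/libraries').where({ id }).find()
    if (!results.length) {
      // Fail predictably instead of letting a missing library crash the page
      throw createError({ statusCode: 404, statusMessage: `Library ${id} not found` })
    }
    return results[0]
  }

  // Images are served straight from the public folder now
  const resolveImage = (id: string, file: string) => `/images/libraries/${id}/${file}`

  return { fetchAll, fetchById, resolveImage }
}
```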

Step 8: Update Search and Navigation

Issue: Updating Search and Navigation Components

Time to update our search and navigation components. This ensures that users can easily find what they're looking for on our site. Think of it as updating the street signs in your neighborhood.

Acceptance Criteria:

  • We'll update SearchComponent.vue to use the new query patterns (see the sketch after this list). This ensures that our search component is working with the new data structure. It’s like upgrading your search engine’s algorithm.
  • Tag filtering and search functionality will be fixed. This ensures that users can easily find libraries based on tags and keywords. It’s like having a more accurate filter in your photo album. Accurate search improves usability.
  • The map component will be updated for the new coordinate format. This ensures that library locations are displayed correctly on the map. It’s like calibrating your GPS for the new map. Correct coordinates are crucial for location accuracy.
  • We'll test search performance improvements. This ensures that our search component is fast and efficient. It’s like running a speed test on your internet connection. Performance improvements enhance user experience.
  • We'll ensure that all navigation links work correctly. This is crucial for site usability, ensuring that users can easily navigate between pages. It’s like making sure all the doors in your house open smoothly. Working navigation links are essential.
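
For tag filtering and keyword search, the flat structure lets the component lean on a single query. A sketch, assuming Nuxt Content's MongoDB-style where operators:

```ts
// Search libraries by tag and/or keyword (illustrative field names)
const searchLibraries = (tag?: string, term?: string) =>
  queryContent('/libraries')
    .where({
      ...(tag ? { tags: { $contains: tag } } : {}),
      ...(term ? { title: { $icontains: term } } : {}),
    })
    .find()
```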

Step 9: Performance Testing

Issue: Validating Build Performance Improvements

Let's talk performance! We need to validate that our migration has actually improved build times. This is like checking the speedometer after tuning up your car.

Acceptance Criteria:

  • We'll measure npm run generate time before and after the migration (a timing sketch follows the success metrics below). This gives us a clear picture of the performance improvements. It’s like timing a race before and after training. Accurate measurements are key.
  • We need to ensure that build time is under 5 seconds per library page (currently ~30 seconds). This is our target performance goal. It’s like setting a personal best for your running time. A faster build time is crucial.
  • Dev server startup time will be tested. A fast startup time improves developer productivity. It’s like having a quick boot-up time for your computer. Faster startup improves workflow.
  • We'll validate that all 1000+ libraries build successfully. This ensures that our migration has not introduced any errors that prevent building. It’s like making sure all the pieces of a puzzle fit together. Successful builds ensure site integrity.
  • Performance improvements will be documented. This helps us track our progress and provides a reference for future optimizations. It’s like keeping a log of your workouts. Documentation helps track progress.

Success Metrics:

  • Build time: < 5 seconds per library page (currently ~30 seconds)
  • Total generation time: < 30 minutes for all libraries
  • Dev server hot reload: < 2 seconds
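
To capture the before/after numbers consistently, PowerShell's Measure-Command is enough for the build timing (a sketch; run it on the same machine both times):

```powershell
# Time a full static generation run (console output is swallowed while timing)
Measure-Command { npm run generate } | Select-Object TotalSeconds

# Dev server startup and hot reload times are simpler to check by hand
# while running `npm run dev`.
```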

Step 10: Cleanup and Documentation

Issue: Cleaning Up Old Structure and Documenting New Organization

Time to tidy up and document our new setup. This ensures that our project is clean and well-documented for future maintenance. Think of it as organizing your workspace after completing a project.

Acceptance Criteria:

  • We'll remove the old content/libraries/*/ folder structure. This ensures that we're not carrying around unnecessary files. It’s like decluttering your hard drive. Removing old files keeps things tidy.
  • The README will be updated with the new data organization. This provides a clear explanation of our new structure for anyone working on the project. It’s like updating the user manual for your software. Clear documentation is essential.
  • Content authoring guidelines will be documented. This ensures that everyone is following the same standards when creating content. It’s like having a style guide for your writing. Consistent guidelines ensure quality.
  • Deployment scripts will be updated if needed. This ensures that our deployment process is working with the new structure. It’s like updating the instructions for assembling a product. Smooth deployment is crucial.
  • Data validation CI checks will be created. This ensures that our data is always valid and consistent. It’s like having an automated quality control system. CI checks ensure data integrity.
  • Migration scripts and backups will be archived. This keeps our project tidy and provides a record of our migration process. It’s like putting old documents in storage. Archiving keeps things organized.

Rollback Plan

Issue: Creating Rollback Procedure for Migration

Just in case things go south, we need a solid rollback plan. This ensures that we can revert to our old structure if necessary. Think of it as having an emergency exit.

Acceptance Criteria:

  • Rollback steps will be documented. This provides a clear guide for reverting the migration. It’s like having a step-by-step instruction manual. Clear steps are essential.
  • The rollback procedure will be tested. This ensures that our rollback plan actually works. It’s like running a fire drill. Testing ensures preparedness.
  • Backup integrity will be ensured. This guarantees that our backups are valid and can be used to restore our data. It’s like checking the expiration date on your emergency supplies. Valid backups are crucial.
  • A rollback validation script will be created. This verifies that the rollback has been successful. It’s like having a checklist to confirm everything is back in place. Validation ensures success.

Priority Order

To make sure we tackle this migration effectively, we've got a priority order:

  • Step 1-2: Create scripts and schema (Low risk, high preparation value) - This is all about laying the groundwork and getting our tools ready.
  • Step 3-5: Execute migration (Medium risk, core changes) - This is where we actually move the content, so it's a bit riskier but super important.
  • Step 6-8: Update components (High risk, functionality changes) - This involves changing our code, so it's the riskiest part. It’s crucial to test thoroughly.
  • Step 9-10: Testing and cleanup (Low risk, validation) - This is about making sure everything works and tidying up.

So there you have it, guys! Our comprehensive migration plan to restructure to Nuxt Content. It's a big job, but with careful planning and execution, we'll end up with a faster, more efficient, and easier-to-manage website. Let's get to it!