{"id":519629,"date":"2025-11-29T20:49:00","date_gmt":"2025-11-30T03:49:00","guid":{"rendered":"https:\/\/jorgep.com\/blog\/?p=519629"},"modified":"2026-02-18T13:02:11","modified_gmt":"2026-02-18T20:02:11","slug":"building-the-ultimate-private-ai-lab","status":"publish","type":"post","link":"https:\/\/jorgep.com\/blog\/building-the-ultimate-private-ai-lab\/","title":{"rendered":"Building the Ultimate Private AI Lab"},"content":{"rendered":"<style>.wp-block-kadence-advancedheading.kt-adv-heading519190_4a1b6f-84, .wp-block-kadence-advancedheading.kt-adv-heading519190_4a1b6f-84[data-kb-block=\"kb-adv-heading519190_4a1b6f-84\"]{font-size:var(--global-kb-font-size-sm, 0.9rem);font-style:normal;}.wp-block-kadence-advancedheading.kt-adv-heading519190_4a1b6f-84 mark.kt-highlight, .wp-block-kadence-advancedheading.kt-adv-heading519190_4a1b6f-84[data-kb-block=\"kb-adv-heading519190_4a1b6f-84\"] mark.kt-highlight{font-style:normal;color:#f76a0c;-webkit-box-decoration-break:clone;box-decoration-break:clone;padding-top:0px;padding-right:0px;padding-bottom:0px;padding-left:0px;}.wp-block-kadence-advancedheading.kt-adv-heading519190_4a1b6f-84 img.kb-inline-image, .wp-block-kadence-advancedheading.kt-adv-heading519190_4a1b6f-84[data-kb-block=\"kb-adv-heading519190_4a1b6f-84\"] img.kb-inline-image{width:150px;vertical-align:baseline;}<\/style>\n<p class=\"kt-adv-heading519190_4a1b6f-84 wp-block-kadence-advancedheading\" data-kb-block=\"kb-adv-heading519190_4a1b6f-84\">AI Disclaimer: I love exploring new technology, and that includes using AI to help with research and editing! My digital &#8220;team&#8221; includes tools like Google Gemini, Notebook LM, Microsoft Copilot, Perplexity.ai, Claude.ai, and others as needed. 
They help me gather insights and polish content\u2014so you get the best, most up-to-date information possible.<\/p>\n\n\n\n<div class=\"wp-block-columns has-theme-palette-7-background-color has-background is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<p>Part of: <strong> <a href=\"https:\/\/jorgep.com\/blog\/series-ai-learnings\/\">AI Learning Series Here<\/a><\/strong><\/p>\n\n\n<style>.kadence-column395113_43ef2d-d5 > .kt-inside-inner-col,.kadence-column395113_43ef2d-d5 > .kt-inside-inner-col:before{border-top-left-radius:0px;border-top-right-radius:0px;border-bottom-right-radius:0px;border-bottom-left-radius:0px;}.kadence-column395113_43ef2d-d5 > .kt-inside-inner-col{column-gap:var(--global-kb-gap-sm, 1rem);}.kadence-column395113_43ef2d-d5 > .kt-inside-inner-col{flex-direction:column;}.kadence-column395113_43ef2d-d5 > .kt-inside-inner-col > .aligncenter{width:100%;}.kadence-column395113_43ef2d-d5 > .kt-inside-inner-col:before{opacity:0.3;}.kadence-column395113_43ef2d-d5{position:relative;}@media all and (max-width: 1024px){.kadence-column395113_43ef2d-d5 > .kt-inside-inner-col{flex-direction:column;justify-content:center;}}@media all and (max-width: 767px){.kadence-column395113_43ef2d-d5 > .kt-inside-inner-col{flex-direction:column;justify-content:center;}}<\/style>\n<div class=\"wp-block-kadence-column kadence-column395113_43ef2d-d5\"><div class=\"kt-inside-inner-col\"><style>.wp-block-kadence-advancedheading.kt-adv-heading510545_6813a5-28, .wp-block-kadence-advancedheading.kt-adv-heading510545_6813a5-28[data-kb-block=\"kb-adv-heading510545_6813a5-28\"]{font-size:var(--global-kb-font-size-sm, 0.9rem);font-style:normal;}.wp-block-kadence-advancedheading.kt-adv-heading510545_6813a5-28 mark.kt-highlight, .wp-block-kadence-advancedheading.kt-adv-heading510545_6813a5-28[data-kb-block=\"kb-adv-heading510545_6813a5-28\"] 
mark.kt-highlight{font-style:normal;color:#f76a0c;-webkit-box-decoration-break:clone;box-decoration-break:clone;padding-top:0px;padding-right:0px;padding-bottom:0px;padding-left:0px;}.wp-block-kadence-advancedheading.kt-adv-heading510545_6813a5-28 img.kb-inline-image, .wp-block-kadence-advancedheading.kt-adv-heading510545_6813a5-28[data-kb-block=\"kb-adv-heading510545_6813a5-28\"] img.kb-inline-image{width:150px;vertical-align:baseline;}<\/style>\n<p class=\"kt-adv-heading510545_6813a5-28 wp-block-kadence-advancedheading\" data-kb-block=\"kb-adv-heading510545_6813a5-28\">Quick Links:&nbsp;<a href=\"https:\/\/jorgep.com\/blog\/resources-for-learning-ai\/\">Resources for Learning AI<\/a> | <a href=\"https:\/\/jorgep.com\/blog\/keeping-up-with-ai\/\">Keep up with AI<\/a> | <a href=\"https:\/\/jorgep.com\/blog\/list-of-ai-tools\/\" data-type=\"post\" data-id=\"402818\">List of AI Tools<\/a><\/p>\n<\/div><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><div class=\"wp-block-template-part\"><style>.wp-block-kadence-advancedheading.kt-adv-heading395113_c650df-47, .wp-block-kadence-advancedheading.kt-adv-heading395113_c650df-47[data-kb-block=\"kb-adv-heading395113_c650df-47\"]{text-align:center;font-size:var(--global-kb-font-size-md, 1.25rem);line-height:60px;font-style:normal;background-color:#f5a511;}.wp-block-kadence-advancedheading.kt-adv-heading395113_c650df-47 mark.kt-highlight, .wp-block-kadence-advancedheading.kt-adv-heading395113_c650df-47[data-kb-block=\"kb-adv-heading395113_c650df-47\"] mark.kt-highlight{font-style:normal;color:#f76a0c;-webkit-box-decoration-break:clone;box-decoration-break:clone;padding-top:0px;padding-right:0px;padding-bottom:0px;padding-left:0px;}.wp-block-kadence-advancedheading.kt-adv-heading395113_c650df-47 img.kb-inline-image, .wp-block-kadence-advancedheading.kt-adv-heading395113_c650df-47[data-kb-block=\"kb-adv-heading395113_c650df-47\"] 
img.kb-inline-image{width:150px;vertical-align:baseline;}<\/style>\n<p class=\"kt-adv-heading395113_c650df-47 wp-block-kadence-advancedheading\" data-kb-block=\"kb-adv-heading395113_c650df-47\">Subscribe to <a href=\"https:\/\/go.35s.be\/jtb\" target=\"_blank\" rel=\"noreferrer noopener\"><strong>JorgeTechBits newsletter<\/strong><\/a><\/p>\n<\/div><\/div>\n<\/div>\n\n\n\n<p>As a follow-up to my <a href=\"https:\/\/jorgep.com\/blog\/how-to-host-multiple-public-websites-on-your-windows-pc\/\" data-type=\"post\" data-id=\"519586\">How to Host Multiple Public Websites on Your Windows PC<\/a> post from a few months back, I am writing this one to document the AI side of my home lab.<\/p>\n\n\n\n<p>If you want to move away from expensive AI subscriptions and bring your data back home, building a local AI home lab is the answer. Today, I\u2019m documenting how I set up my new <strong>Ryzen AI 9 HX370<\/strong> server to act as a private AI powerhouse for both chat and automation.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">The Goal<\/h2>\n\n\n\n<p>A fully self-hosted AI stack where <strong>Ollama<\/strong> runs the models, <strong>Open WebUI<\/strong> provides the interface, and <strong>n8n<\/strong> handles complex automations\u2014all accessible from anywhere without a Virtual Private Server (VPS). I call it my &#8220;Zero VPS&#8221; Strategy.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">1. The Hardware: Why the Ryzen AI 9?<\/h2>\n\n\n\n<p>We are using the <strong>Ryzen AI 9 HX370<\/strong>. 
While it has a powerful Radeon 890M iGPU, getting ROCm (AMD\u2019s GPU software) to play nice in Docker on Windows is currently &#8220;bleeding edge.&#8221; For maximum stability, we chose to run the stack in <strong>Docker on Windows (WSL2)<\/strong> using CPU optimization and the fast AVX-512 instruction set.<\/p>\n\n\n\n<p>My Hardware Specs:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>MINISFORUM A1 Z1 Pro-370 Mini PC<\/li>\n\n\n\n<li><a href=\"https:\/\/www.amd.com\/en\/products\/processors\/laptop\/ryzen\/ai-300-series\/amd-ryzen-ai-9-hx-370.html\">AMD Ryzen AI 9 HX370<\/a><\/li>\n\n\n\n<li>GPU: AMD Radeon 890M (Copilot+ AI PC)<\/li>\n\n\n\n<li>12 cores \/ 24 threads, 80 TOPS, up to 5.1GHz (Zen 5 architecture), 2TB storage<\/li>\n\n\n\n<li>64GB DDR5, WiFi 7, 2x USB4, 2x RJ45, Bluetooth 5.4<\/li>\n\n\n\n<li>NPU, CPU, and GPU<\/li>\n\n\n\n<li>Windows 11 Pro<\/li>\n<\/ul>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><strong>The Result:<\/strong> You now have a production-grade AI lab, accessible from anywhere, that costs $0\/month in subscriptions. Whether you are chatting with locally installed LLMs (Llama 3.2, Qwen) or the hundreds of models available via OpenRouter, the control is entirely in your hands\u2014all of which can be used from a local n8n installation (no VPS required!).<\/p>\n<\/blockquote>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">2. 
The &#8220;Perfect&#8221; Docker-Compose File<\/h2>\n\n\n\n<p>This configuration ensures all three services talk to each other internally while exposing the right ports for external access from your laptop or phone.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><code>docker-compose.yml<\/code><\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code>services:\n  ollama:\n    image: ollama\/ollama\n    container_name: ollama\n    volumes:\n      - ollama_data:\/root\/.ollama\n    ports:\n      - \"11434:11434\" # Allows access from your laptop\n    environment:\n      - OLLAMA_HOST=0.0.0.0 # Tells Ollama to listen to the network\n      - OLLAMA_KEEP_ALIVE=-1 # Keeps models in RAM for instant response\n    networks:\n      - ai-network\n    restart: unless-stopped\n\n  open-webui:\n    image: ghcr.io\/open-webui\/open-webui:main\n    container_name: open-webui\n    ports:\n      - \"3000:8080\"\n    environment:\n      - OLLAMA_BASE_URL=http:\/\/ollama:11434\n    volumes:\n      - open_webui_data:\/app\/backend\/data\n    networks:\n      - ai-network\n    restart: unless-stopped\n\n  n8n:\n    image: n8nio\/n8n:latest\n    container_name: n8n\n    ports:\n      - \"5678:5678\"\n    environment:\n      - N8N_HOST=192.168.12.88 # Replace with your Ryzen Server IP\n      - WEBHOOK_URL=http:\/\/192.168.12.88:5678\/ # Replace with your Ryzen Server IP\n      - OLLAMA_HOST=http:\/\/ollama:11434\n      - N8N_SECURE_COOKIE=false # Required for non-HTTPS local access\n    volumes:\n      - n8n_data:\/home\/node\/.n8n\n    networks:\n      - ai-network\n    restart: unless-stopped\n\nnetworks:\n  ai-network:\n\nvolumes:\n  ollama_data:\n  n8n_data:\n  open_webui_data:\n<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">3. 
Scaling with OpenRouter (Hybrid AI)<\/h2>\n\n\n\n<p>For tasks too heavy for local hardware, we integrated <strong>OpenRouter<\/strong>, which provides API access to frontier models like Claude 3.5 Sonnet.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>In Open WebUI:<\/strong> Go to Settings &gt; Admin &gt; Connections and add <code>https:\/\/openrouter.ai\/api\/v1<\/code> with your API key.<\/li>\n\n\n\n<li><strong>In n8n:<\/strong> Use the &#8220;OpenRouter&#8221; node to access high-intelligence models when your local Llama needs a &#8220;second opinion.&#8221;<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">4. Performance &amp; Benchmarks<\/h2>\n\n\n\n<p>One of the strongest arguments for this &#8220;Hybrid&#8221; approach is the balance between local privacy and cloud power. While local models are &#8220;free&#8221; to run once you own the hardware, cloud models via OpenRouter offer incredible speed and zero impact on your Ryzen server\u2019s storage.<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><td><strong>Model Source<\/strong><\/td><td><strong>Model Name<\/strong><\/td><td><strong>Local Storage<\/strong><\/td><td><strong>Expected Speed (TPS)<\/strong><\/td><td><strong>Best Use Case<\/strong><\/td><\/tr><\/thead><tbody><tr><td><strong>Local (Ollama)<\/strong><\/td><td>Llama 3.2 1B<\/td><td>~1.3 GB<\/td><td><strong>45-50 t\/s<\/strong><\/td><td>Instant classification &amp; routing.<\/td><\/tr><tr><td><strong>Local (Ollama)<\/strong><\/td><td>Llama 3.2 3B<\/td><td>~2.0 GB<\/td><td><strong>25-30 t\/s<\/strong><\/td><td>Private drafting &amp; daily assistance.<\/td><\/tr><tr><td><strong>Local (Ollama)<\/strong><\/td><td>Qwen 2.5 14B<\/td><td>~9.0 GB<\/td><td><strong>8-12 t\/s<\/strong><\/td><td>Deep logic &amp; local coding help.<\/td><\/tr><tr><td><strong>Cloud (OpenRouter)<\/strong><\/td><td><strong>GPT-4o<\/strong><\/td><td><strong>0 GB<\/strong><\/td><td><strong>60-80 t\/s<\/strong><\/td><td>High-speed, high-intelligence 
tasks.<\/td><\/tr><tr><td><strong>Cloud (OpenRouter)<\/strong><\/td><td><strong>Claude 3.5 Sonnet<\/strong><\/td><td><strong>0 GB<\/strong><\/td><td><strong>50-70 t\/s<\/strong><\/td><td>Advanced coding &amp; complex reasoning.<\/td><\/tr><tr><td><strong>Cloud (OpenRouter)<\/strong><\/td><td><strong>Gemini 2.5 Flash<\/strong><\/td><td><strong>0 GB<\/strong><\/td><td><strong>200+ t\/s<\/strong><\/td><td>Massive context &amp; ultra-fast bursts.<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Key Takeaway<\/strong>:<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Local Storage:<\/strong> OpenRouter models require <strong>0 GB of local storage<\/strong>. This is perfect for when you want to use a massive 400B-parameter model that would never fit on a standard SSD.<\/li>\n\n\n\n<li><strong>Throughput (TPS):<\/strong> While the Ryzen AI 9 is impressively fast, cloud providers use massive GPU clusters. Using <strong>Gemini 2.5 Flash<\/strong> via OpenRouter can give you speeds that feel like the text is appearing all at once, which is a game-changer for long-form content generation in n8n.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">5. Security &amp; Remote Access: The Tailscale Way<\/h2>\n\n\n\n<p>Instead of opening risky ports on your router, we used <strong>Tailscale<\/strong>.<\/p>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li><strong>Install Tailscale<\/strong> on the Ryzen Server and your Laptop.<\/li>\n\n\n\n<li><strong>Log in<\/strong> with the same account.<\/li>\n\n\n\n<li>Install the Tailscale app on your phone as well.<\/li>\n\n\n\n<li>Your server gets a private IP (<strong>100.x.x.x<\/strong>). You can now access your n8n or Open WebUI from a coffee shop exactly as if you were home.<\/li>\n<\/ol>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Summary<\/h2>\n\n\n\n<p>You\u2019ve just built an enterprise-grade AI infrastructure for the price of a single high-end PC. 
By leveraging <strong>n8n<\/strong> as the orchestrator, you can build automations that are private, cost $0 in monthly fees, and scale infinitely.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>The &#8220;Zero Storage&#8221; Strategy<\/strong><\/h3>\n\n\n\n<p>You can start with <strong>0 GB of models<\/strong> and just use OpenRouter for everything while you learn the ropes. Then, as you get comfortable, you can &#8220;pull&#8221; local models like Llama 3.2 or Qwen to save money on repetitive tasks.<\/p>\n\n\n\n<p>This makes the &#8220;barrier to entry&#8221; for a home lab practically zero!<\/p>\n\n\n\n<div style=\"font-family: Verdana, Geneva, sans-serif; font-size: 11px;\"><b>Disclaimer<\/b>: \u00a0I work for <a href=\"https:\/\/www.dell.com\/en-us\/work\/learn\/by-service-type-deployment\">Dell Technology Services<\/a> as a Workforce Transformation Solutions Principal.\u00a0 \u00a0 It is my passion to help guide organizations\u00a0through the current technology transition specifically as it relates to <a href=\"https:\/\/www.delltechnologies.com\/en-us\/what-we-do\/workforce-transformation.htm\">Workforce Transformation<\/a>.\u00a0 Visit <a href=\"https:\/\/www.delltechnologies.com\/en-us\/index.htm\">Dell Technologies<\/a>\u00a0site for more information.\u00a0 Opinions are my own and not the views of my employer.<\/div>\n<div><\/div><br>\n","protected":false},"excerpt":{"rendered":"<p>As a follow-up to my How to Host Multiple Public Websites on Your Windows PC post a few months back, I am writing this one. If you want to move away from expensive AI subscriptions and bring your data back home, building a local AI home lab is the answer. 
Today, I\u2019m documenting how I&#8230;<\/p>\n","protected":false},"author":2,"featured_media":519682,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_kad_blocks_custom_css":"","_kad_blocks_head_custom_js":"","_kad_blocks_body_custom_js":"","_kad_blocks_footer_custom_js":"","ngg_post_thumbnail":0,"episode_type":"","audio_file":"","podmotor_file_id":"","podmotor_episode_id":"","cover_image":"","cover_image_id":"","duration":"","filesize":"","filesize_raw":"","date_recorded":"","explicit":"","block":"","itunes_episode_number":"","itunes_title":"","itunes_season_number":"","itunes_episode_type":"","_kad_post_transparent":"","_kad_post_title":"","_kad_post_layout":"","_kad_post_sidebar_id":"","_kad_post_content_style":"","_kad_post_vertical_padding":"","_kad_post_feature":"","_kad_post_feature_position":"","_kad_post_header":false,"_kad_post_footer":false,"_kad_post_classname":"","footnotes":""},"categories":[681,441],"tags":[471,930,917,919,986,742,1012,1013],"class_list":["post-519629","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-moderneuc2","category-tech-talk","tag-ai","tag-ai-series","tag-containers","tag-docker","tag-local-ai","tag-moderneuc1","tag-self-hosting","tag-zero-vps-strategy"],"taxonomy_info":{"category":[{"value":681,"label":"ModernEUC"},{"value":441,"label":"Tech Talk"}],"post_tag":[{"value":471,"label":"AI"},{"value":930,"label":"AI Series"},{"value":917,"label":"Containers"},{"value":919,"label":"Docker"},{"value":986,"label":"Local AI"},{"value":742,"label":"ModernEUC"},{"value":1012,"label":"self hosting"},{"value":1013,"label":"Zero VPS Strategy"}]},"featured_image_src_large":["https:\/\/jorgep.com\/blog\/wp-content\/uploads\/FeaturedImage-BUildingMyPrivateAILab-1024x512-mod01.png",1024,512,false],"author_info":{"display_name":"Jorge 
Pereira","author_link":"https:\/\/jorgep.com\/blog\/author\/jorge\/"},"comment_info":0,"category_info":[{"term_id":681,"name":"ModernEUC","slug":"moderneuc2","term_group":0,"term_taxonomy_id":691,"taxonomy":"category","description":"","parent":0,"count":261,"filter":"raw","cat_ID":681,"category_count":261,"category_description":"","cat_name":"ModernEUC","category_nicename":"moderneuc2","category_parent":0},{"term_id":441,"name":"Tech Talk","slug":"tech-talk","term_group":0,"term_taxonomy_id":451,"taxonomy":"category","description":"","parent":0,"count":670,"filter":"raw","cat_ID":441,"category_count":670,"category_description":"","cat_name":"Tech Talk","category_nicename":"tech-talk","category_parent":0}],"tag_info":[{"term_id":471,"name":"AI","slug":"ai","term_group":0,"term_taxonomy_id":481,"taxonomy":"post_tag","description":"","parent":0,"count":141,"filter":"raw"},{"term_id":930,"name":"AI Series","slug":"ai-series","term_group":0,"term_taxonomy_id":940,"taxonomy":"post_tag","description":"","parent":0,"count":144,"filter":"raw"},{"term_id":917,"name":"Containers","slug":"containers","term_group":0,"term_taxonomy_id":927,"taxonomy":"post_tag","description":"","parent":0,"count":6,"filter":"raw"},{"term_id":919,"name":"Docker","slug":"docker","term_group":0,"term_taxonomy_id":929,"taxonomy":"post_tag","description":"","parent":0,"count":9,"filter":"raw"},{"term_id":986,"name":"Local AI","slug":"local-ai","term_group":0,"term_taxonomy_id":996,"taxonomy":"post_tag","description":"","parent":0,"count":23,"filter":"raw"},{"term_id":742,"name":"ModernEUC","slug":"moderneuc1","term_group":0,"term_taxonomy_id":752,"taxonomy":"post_tag","description":"","parent":0,"count":284,"filter":"raw"},{"term_id":1012,"name":"self hosting","slug":"self-hosting","term_group":0,"term_taxonomy_id":1022,"taxonomy":"post_tag","description":"","parent":0,"count":3,"filter":"raw"},{"term_id":1013,"name":"Zero VPS 
Strategy","slug":"zero-vps-strategy","term_group":0,"term_taxonomy_id":1023,"taxonomy":"post_tag","description":"","parent":0,"count":2,"filter":"raw"}],"_links":{"self":[{"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/posts\/519629","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/comments?post=519629"}],"version-history":[{"count":5,"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/posts\/519629\/revisions"}],"predecessor-version":[{"id":519683,"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/posts\/519629\/revisions\/519683"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/media\/519682"}],"wp:attachment":[{"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/media?parent=519629"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/categories?post=519629"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/jorgep.com\/blog\/wp-json\/wp\/v2\/tags?post=519629"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}