About 3 weeks ago I started writing blogs on Dev.to, but I also have a portfolio with blogs, and obviously I would like to maintain the blogs in both places. I started doing this manually: I wrote blogs on my portfolio and copied the same blog here to dev.to. Even though I don't write more than 2 blogs a week, it was manual work and a little tiring (I am lazy). On top of that, if there is a change in one blog I have to replicate it in the other place, which can easily be missed. So I automated it down to 1 click! It can even be fully automated with CRON jobs on Lambda/Workers or on any existing server.
Requirements / User stories
- I write 1-3 blogs per week.
- On average I make at most 4 updates per blog, all within a span of 24 hours of publishing; after that there are rarely any updates.
- Even if one of the platforms, dev.to or my portfolio, is a few hours behind, that's alright for me.
Solutions
RSS feed
So there is a wonderful feature, Publishing to DEV Community from RSS, provided by dev.to, which can be accessed in settings.
I just have to push my blogs to an RSS feed and let dev.to fetch them from there.
Major points to note!
- Formatting will typically look good, but you may have to make manual fixes. In the case of Medium, you may have to manually fix embeds.
- Updates are periodically fetched (CRON jobs!)
- The blog will land as a draft in my dashboard; I have to publish it manually.
The first pain point for me is that I have to create the RSS feed myself. It might be easy, but currently I have no idea how to do it, and then there are the manual steps I highlighted above.
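For the curious, an RSS feed is just an XML file listing the posts, so "creating one" roughly means rendering something like the sketch below and serving it from the portfolio. This is a hypothetical illustration, not something I built; the posts array and output path are placeholders.

```js
const fs = require("fs");

// Placeholder data: in a real setup this would come from the portfolio's posts
const posts = [
  {
    title: "useEffect: The basics",
    url: "https://example.com/blog/useeffect-the-basics",
    date: new Date().toUTCString(),
    html: "<p>Full post body as HTML goes here.</p>",
  },
];

// Render each post as an RSS <item>
const items = posts
  .map(
    (post) => `
    <item>
      <title>${post.title}</title>
      <link>${post.url}</link>
      <pubDate>${post.date}</pubDate>
      <description><![CDATA[${post.html}]]></description>
    </item>`
  )
  .join("");

// Minimal RSS 2.0 document that dev.to could periodically fetch
const feed = `<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>My portfolio blog</title>
    <link>https://example.com/blog</link>
    <description>Posts from my portfolio</description>${items}
  </channel>
</rss>`;

fs.writeFileSync("./public/rss.xml", feed, "utf8");
```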
But after getting to know about this hook, I thought: what if dev.to provides open APIs?
Voila, there are!
Dev APIs
A simple Google search landed me on the API documentation.
With the APIs I don't even have to worry about updating my blogs!
What APIs do we need?
- Listing: to get the list of blogs
- Content: to get the blog content
Both are available, that too with very useful filters.
I am using https://dev.to/api/articles?username=mukuljainx to fetch the list of blogs, and https://dev.to/api/articles/mukuljainx/useeffect-the-basics-and-the-secrets-4ehg to fetch a particular article/blog.
The listing API provides a lot of things for each article; there is even more, but here are the important ones:
{
type_of: string;
id: number;
title: string;
description: string;
readable_publish_date: string;
// unique url endpoint for that article
slug: string;
path: string;
url: string;
comments_count: number;
public_reactions_count: number;
collection_id: number;
published_timestamp: Date;
positive_reactions_count: number;
cover_image: string;
user: {...};
}
For the listing I needed the title, time, reading time, description, and slug (to keep the article URL on my portfolio the same as here, since the slug is unique for a user).
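In code that boils down to a small map over the listing response, a sketch along these lines (`articles` being the array returned by the listing endpoint; the field picks mirror what I use later in gatsby-node.js):

```js
// Keep only the fields the portfolio's listing page cares about
const listingForPortfolio = articles.map((article) => ({
  title: article.title,
  date: article.published_timestamp,
  readableDate: article.readable_publish_date,
  readingTime: article.reading_time_minutes,
  preview: article.description,
  // the slug keeps the portfolio URL in sync with the dev.to one
  slug: `/blog/${article.slug}`,
}));
```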
Content API
It provides the same information about the article as above, plus the rendered HTML and the raw markdown. Using the rendered HTML I could directly show any blog on my portfolio, I would just have to append it. The problem with that approach is that the theme will differ: every portfolio has its own theme, and so does mine. For example, I am using different CSS for code blocks and different colors for links.
I could override the CSS, but the class names or the HTML arrangement might change in the future. The only thing which is going to stay the same is the raw markdown, and that's what we will use!
To use the raw markdown we need a parser which can convert it into HTML at runtime, and I already had one. I am using Gatsby for my portfolio and their official gatsby-transformer-remark plugin to convert my MD files to HTML, using CSS to prettify it as needed.
A code block, for example, ends up with my portfolio's own styling.
Either I have to change my parser, as gatsby-transformer-remark parses MD files rather than raw markdown at runtime (it might be able to, I didn't explore much), or use some other library for the same.
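For completeness, runtime parsing would look roughly like this with a generic markdown library such as marked. This is a sketch of the road not taken, not what my portfolio actually does; `article` here is a response from the content API.

```js
const { marked } = require("marked");

// Convert the raw markdown returned by the content API into HTML at runtime
const html = marked.parse(article.body_markdown);
// ...then inject `html` into the page and style it with the portfolio's CSS
```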
But if you remember the requirements, I might not need live updates; I don't mind if my portfolio blogs are updated after 10 hours.
I already have a setup: Gatsby plugins are already parsing the Markdown files and creating a static site, so it will be fast on any system, even with a slower internet connection. Gatsby does a lot of enhancements to achieve this; you can read more here.
What if I generate temporary MD files from the content fetched from dev.to just before the build, in the same folder the plugin is already picking files from? For the listing, I can source the data from the listing API into GraphQL (Gatsby uses it as the default data source).
This can be achieved using gatsby-node.js.
Gatsby Nodejs
Don't know Node.js? Not a problem, at least for this part: it's an environment with a bunch of awesome APIs, but it is still JS. We will be using 2 Node APIs, path and fs: one to resolve a path and the other to write files at that path. That's it.
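That usage really is as small as it sounds; a minimal sketch (the file name here is just an example):

```js
const fs = require("fs");
const path = require("path");

// path builds an absolute path, fs writes a file at that path
const target = path.resolve(__dirname, "src/pages/posts/example.md");
fs.writeFileSync(target, "# Hello from dev.to", "utf8");
```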
There is a hook, onPreInit, which runs before all plugins; return a promise from it so Gatsby can wait before proceeding further.
I used axios to make the API calls and fs to write the files:
- I got the articles list from the listing API
- Then fetched each article using its slug
- Saved the raw MD, with some meta.
const axios = require("axios");
const fs = require("fs");
const path = require("path");

// Fetch the full content (including the raw markdown) of every article in the listing
const getAllArticles = async (listing) => {
  const articles = await Promise.allSettled(
    listing.map((list) =>
      axios.get(`https://dev.to/api/articles/mukuljainx/${list.slug}`)
    )
  );
  // Skip any request that failed instead of crashing the build
  return articles
    .filter((x) => x.status === "fulfilled")
    .map((x) => x.value.data);
};

// Creates files from dev.to blogs
// exports.onPreInit === export const onPreInit
// or export { onPreInit }
// things differ due to different module format
exports.onPreInit = async () => {
  const listingJSON = await axios.get(
    "https://dev.to/api/articles/latest?username=mukuljainx"
  );
  const listing = listingJSON.data;
  const articles = await getAllArticles(listing);

  articles.forEach((article) => {
    // meta (front matter) for that file, can be accessed later using a GraphQL query
    const meta = {
      slug: `/blog/${article.slug}`,
      date: article.created_at,
      readableDate: article.readable_publish_date,
      title: article.title,
      preview: article.description,
      readingTime: article.reading_time_minutes,
      reactionsCount: article.public_reactions_count,
      commentsCount: article.comments_count,
      url: article.url,
    };

    // Write the raw markdown with a front-matter block on top, into the folder
    // gatsby-transformer-remark is already picking files from
    fs.writeFileSync(
      path.resolve(`${__dirname}/src/pages/posts/dev-to-${article.slug}.md`),
      "---\n" +
        Object.keys(meta)
          .map((k) => `${k}: "${meta[k]}"`)
          .join("\n") +
        "\n---\n" +
        article.body_markdown,
      "utf8"
    );
  });
};
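On the portfolio side, the front matter written above can then be read back with a regular gatsby-transformer-remark GraphQL query inside a page component, roughly like this (a sketch; the exact fields and sorting depend on your page setup):

```js
import { graphql } from "gatsby";

// Query the generated markdown nodes, including the meta written as front matter
export const query = graphql`
  query {
    allMarkdownRemark {
      nodes {
        html
        frontmatter {
          title
          slug
          readableDate
          readingTime
          preview
          url
        }
      }
    }
  }
`;
```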
Check it out on GitHub (consider starring it if you found it useful).
Build it
This is all, but I still need to build my site to reflect the changes. To automate that, one can use CRON jobs (the free tier works on Cloudflare or Heroku); I am not doing it as it's not worth it right now. I just have to click "Build" on Cloudflare.
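If I ever do automate it, a scheduled Cloudflare Worker that POSTs to a deploy hook would be enough. A sketch under those assumptions (the deploy-hook URL is a placeholder you would get from the Cloudflare dashboard, and the cron schedule lives in wrangler.toml):

```js
// Module-syntax Cloudflare Worker triggered by a cron schedule
export default {
  async scheduled(event, env, ctx) {
    // Hitting the deploy hook makes Cloudflare rebuild the site,
    // which re-runs gatsby-node.js and pulls fresh content from dev.to
    ctx.waitUntil(
      fetch(
        "https://api.cloudflare.com/client/v4/pages/webhooks/deploy_hooks/<HOOK_ID>",
        { method: "POST" }
      )
    );
  },
};
```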
Dev.to even provides generated portfolios/sites using Stackbit, but the themes are limited and you can't use a custom one. They might be using somewhat the same approach with CRON updates!
So you can use either this approach, the APIs mentioned above, or Stackbit.
--EOF--