SEO techniques tend to work once your site has been linked to a fair number of times by reputable sources. Titles, keywords, sitemaps, and even meta tags are important factors, no matter how much any one search engine tries to downplay them.
The first thing I've noticed is that I don't have a sitemap and I don't have good meta tags. Facebook reads some meta tags (the Open Graph ones) when crawling content! I definitely need a way to submit posts semi-automatically (not automation, just augmentation!) to search engines to increase the amount of my content that any search engine sees. I need a sitemap generator, so I've decided my backend will take care of generating one in a hybrid manner. I also need to add more fields to my new-post page to accommodate the keywords, title, and so on for the various meta tags.
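To make that concrete, here's a rough sketch of the on-demand half of that hybrid sitemap generation: an Express route that builds sitemap.xml from the posts. The post shape, `listPosts`, and the domain are placeholders, not my actual backend:

```typescript
import express from "express";

// Hypothetical post shape and data source; the real version would
// query the posts table instead of returning a hardcoded list.
interface Post {
  slug: string;
  updatedAt: Date;
}
async function listPosts(): Promise<Post[]> {
  return [{ slug: "hello-world", updatedAt: new Date() }];
}

const app = express();

// Generate sitemap.xml on demand; a "hybrid" setup could cache the
// result and regenerate it only when a new post is published.
app.get("/sitemap.xml", async (_req, res) => {
  const posts = await listPosts();
  const urls = posts
    .map(
      (p) =>
        `<url><loc>https://example.com/post/${p.slug}</loc>` +
        `<lastmod>${p.updatedAt.toISOString()}</lastmod></url>`
    )
    .join("");
  res
    .type("application/xml")
    .send(
      `<?xml version="1.0" encoding="UTF-8"?>` +
        `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">${urls}</urlset>`
    );
});

app.listen(3000);
```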
Second, I'm looking to get more reputable sources to point to the blog, and to see whether old sources can be updated to point to the correct links on the blog. I spent some time making sure that the old WordPress URL fragments would be caught by my new blog and show the correct content. All the major social media sites are reputable sources. I remember the day my blog exploded when I hit #1 on Hacker News thanks to a statistical/ML Chrome extension I built in college. Yes, my WordPress blog sustained all that traffic amazingly well! I attribute that to my performance-driven nature.
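For catching those WordPress fragments, the idea is roughly this; the slug map and URL pattern are illustrative, not my real routing:

```typescript
import express from "express";

const app = express();

// Hypothetical slug-to-new-URL map; the real lookup would hit the
// posts table that stores each post's old WordPress slug.
const legacySlugs: Record<string, string> = {
  "my-chrome-extension": "/post/my-chrome-extension",
};

// Catch WordPress-style permalinks like /2013/05/my-chrome-extension/
// and 301-redirect them so old inbound links keep their value.
app.get(/^\/(\d{4})\/(\d{2})\/([^/]+)\/?$/, (req, res, next) => {
  const target = legacySlugs[req.params["2"]]; // third capture group
  if (target) {
    res.redirect(301, target);
  } else {
    next(); // not a known legacy post; fall through to normal routing
  }
});
```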
Third, maybe server-side rendering, so that search engine crawlers that do not support JavaScript can still render the content.
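A minimal sketch of what that could look like with React's `renderToString`, with a placeholder Post component and hypothetical Open Graph tags baked into the head (real code would escape the interpolated values):

```typescript
import React from "react";
import { renderToString } from "react-dom/server";
import express from "express";

// Placeholder article component; the real blog's component tree is larger.
function Post({ title, body }: { title: string; body: string }) {
  return React.createElement(
    "article",
    null,
    React.createElement("h1", null, title),
    React.createElement("p", null, body)
  );
}

const app = express();

app.get("/post/:slug", (req, res) => {
  // Hypothetical content lookup keyed on the slug.
  const title = `Post: ${req.params.slug}`;
  const body = "Server-rendered content goes here.";
  const html = renderToString(React.createElement(Post, { title, body }));

  // Crawlers that don't execute JavaScript (and Facebook's scraper)
  // see fully rendered markup plus the meta tags.
  res.send(`<!doctype html>
<html>
  <head>
    <title>${title}</title>
    <meta property="og:title" content="${title}" />
    <meta property="og:type" content="article" />
  </head>
  <body><div id="root">${html}</div></body>
</html>`);
});

app.listen(3000);
```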
I wonder how AMP works (not that I need it or anything). I wonder whether Googlers can manipulate Google search results without getting caught. It would be prudent to have a keyword watcher that monitors the blog's content for the terms I want to rank on.
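The keyword watcher could start as simply as this; the keyword list and the word-matching rule are illustrative:

```typescript
// Minimal keyword watcher: counts how often each target keyword
// appears in a post, so posts drifting off those terms can be flagged.
const targetKeywords = ["react", "seo", "sitemap"];

function keywordReport(content: string): Map<string, number> {
  const words = content.toLowerCase().match(/[a-z0-9']+/g) ?? [];
  const counts = new Map<string, number>();
  for (const kw of targetKeywords) {
    counts.set(kw, words.filter((w) => w === kw).length);
  }
  return counts;
}

// Example: keywordReport("SEO basics: add a sitemap, then more SEO.")
// -> Map { "react" => 0, "seo" => 2, "sitemap" => 1 }
```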
I've thought of a few new features to better relate the content and to better describe each article's related content. I'm using React for this blog, so this will be a fun venture, adding more features on top of the existing stack.
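One sketch of a "relate the content" feature: score other posts by how many tags they share with the current one. The post shape here is an assumption, not my actual data model:

```typescript
// Rank other posts by tag overlap with the current post.
interface TaggedPost {
  slug: string;
  tags: string[];
}

function relatedPosts(
  current: TaggedPost,
  all: TaggedPost[],
  limit = 3
): TaggedPost[] {
  const tagSet = new Set(current.tags);
  return all
    .filter((p) => p.slug !== current.slug)
    .map((p) => ({ post: p, score: p.tags.filter((t) => tagSet.has(t)).length }))
    .filter((s) => s.score > 0) // drop posts with no tags in common
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map((s) => s.post);
}
```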