Instead of coding on , I ended up tweaking fixato.org/music/recommendatio a bit.
Sometimes you gotta switch it up with a fun instead, right? :)

Time for a bit more coding on , and archiving of personal content from the GoogleUserContent servers before it's inevitably deleted too.

Uff... I need to stay awake long enough to get into my second wind, so I can stay up to do some more coding...

Had a refreshing after a day that had kinda been eating up my spoons.
Just sat down behind my laptop with a cup of tea after downing a mug of cultured milk with blackberry syrup. should be asleep for the night, so let's see if I can get some coding done. :)

Even though the APIs are no more, there are still some things I want to add to tools.

It's a bit sad to realise, though, that the things I write now won't be useful anymore in a few weeks.

Right, I've been coding on and querying the various / endpoints all through the afternoon, evening, night and morning. I think I deserve some sleep ...

Interestingly btw, the (Web and REST) APIs are already randomly returning HTTP 204 and 404 responses, even for valid requests, which forced me to implement automatic and manual retries of API queries.
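A minimal sketch of what such a retry wrapper could look like, in Ruby; the helper name, signature, and error class are my own illustration under the assumption described above, not the actual Plexodus-Tools code:

```ruby
# Illustrative retry helper for API calls that spuriously "fail"
# (e.g. empty 204 or bogus 404 responses for valid requests).
class RetriesExhausted < StandardError; end

# Re-runs the block until it returns a truthy result, sleeping between
# attempts, and gives up after max_attempts tries.
def with_retries(max_attempts: 3, delay: 0)
  (1..max_attempts).each do |attempt|
    result = yield(attempt)
    return result if result
    sleep delay
  end
  raise RetriesExhausted, "gave up after #{max_attempts} attempts"
end

# Simulated flaky endpoint: the first two calls return nothing.
responses = [nil, nil, '{"ok": true}']
body = with_retries(max_attempts: 5) { responses.shift }
```

The manual-retry case mentioned above would simply be rescuing `RetriesExhausted` and prompting the user before calling again.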

Even if no one else will end up using these scripts, I do feel like I've learnt quite a bit about and collected a nice set of functions.

Today's freshly committed code will not only request relevant JSON from the APIs, it will also combine it all into a single structured JSON file per domain.
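The combining step could be sketched roughly like this; the endpoint names, sample payloads, and output filename are all illustrative, not the committed code itself:

```ruby
require 'json'

# Illustrative sketch: merge several raw per-endpoint JSON responses
# into one structured document, keyed by endpoint name.
def combine_responses(responses_by_endpoint)
  responses_by_endpoint.transform_values { |raw| JSON.parse(raw) }
end

combined = combine_responses(
  "activities" => '[{"id": "a1"}]',
  "comments"   => '[{"id": "c1", "activityId": "a1"}]'
)

# One structured file per domain, e.g. example.com.json
File.write("example.com.json", JSON.pretty_generate(combined))
```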

Hurry up though: the scripts rely on the , which will go away on March 7th, so in a few days.

The most important thing will be to retrieve all the data; once it's cached locally, future updates that improve the structure and parsing of the JSON archive can work from the cached results.
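That cache-first idea could look something like this; the helper name and paths are hypothetical, but the point stands: the expensive API call in the block only runs when no local copy exists, so parsing improvements can be rerun against cached JSON after the API is gone:

```ruby
require 'json'
require 'fileutils'

# Illustrative cache-first fetch: read from disk if we already have the
# data, otherwise run the block (the one-time API request) and cache it.
def fetch_cached(cache_path)
  return JSON.parse(File.read(cache_path)) if File.exist?(cache_path)
  data = yield
  FileUtils.mkdir_p(File.dirname(cache_path))
  File.write(cache_path, JSON.generate(data))
  data
end
```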

update for owners who had the Comments widget enabled for their blog:
github.com/FiXato/Plexodus-Too has had some big updates in the past week, specifically the github.com/FiXato/Plexodus-Too section.

bin/export_blogger_comments.sh can be used to archive all Comments for your Blogger blog.

Note: most of these tools can also be used if you use Google+ Comments for Sites.

Right... even though I'd love to take more advantage of the (mostly) undisturbed silence and peace of mind at night, I think I need to try to get some sleep.
I fear my code quality will otherwise suffer.
Hopefully I'll find a quiet and undisturbed moment tomorrow during the day where I can finish up the Blogger GPlus Comments export part of ...
With APIs shutting down later this week, time's running out.

Google+ Exodus Collection
If you want to follow my posts about the shutdown of Google+, my ongoing development of the set of export / migration utilities, and other news related to the Google+ Exodus, feel free to follow my G+ Exodus collection on Google Plus (till that ship goes under):
plus.google.com/collection/wak

Plexodus-Tools and sleep well #tootiverse 

It's 7:30am... I've managed to do some coding at the expense of a good night's rest.

But I've got a small update. It's not really related to the tool itself, but rather a script that gathers some data about duplicate files in the archives.
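The core of such a duplicate check can be tiny: group files by content hash and keep the groups with more than one member. A minimal sketch (the function name is my own, not the script's):

```ruby
require 'digest'

# Illustrative duplicate-file detection: files with identical content
# hash to the same digest and end up in the same group.
def duplicate_groups(paths)
  paths.group_by { |path| Digest::SHA256.file(path).hexdigest }
       .values
       .select { |group| group.size > 1 }
end
```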

Anyway, time to get some shut-eye!
Sleep well, !

Don't think I'll manage to write more code tonight, especially now is awake again, so I've committed and pushed my code to github.com/FiXato/Plexodus-Too.
Besides, with how late it is and how tired my eyes are getting, I doubt any code I write now will be any good anyway...

will have to wait some more. BeardGrabber needs some more attention now instead, and hopefully some sleep for his daddy. :)

*Google Plus Shutdown Expedited*
If you are a user, and were hoping you had till August 2019 to extract your data out of the platform, I have some bad news:
The Google+ API will be shut down in 3 months' time.
Google+ for Consumers will shut down as early as *APRIL* 2019, due to another (since fixed) data leak bug: blog.google/technology/safety-

This means I have even less time to develop my , especially those that require the API.

Plexodus-Tools and my Google+ Circles stats 

Now that the part of my script that merges Circles Takeout and People API data is done, I can actually have some fun with the data.
As a result, here's the list of the most popular profile providers among the 489 users I follow:
Twitter: 132
Facebook: 88
YouTube: 58
Blogger: 48
LinkedIn: 45
Flickr: 37
picasaweb.google.com: 35
Google+: 25
Google Reader: 22
Instagram: 19
GitHub: 19
quora.com: 14
last.fm: 12
pinterest.com: 11
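A tally like the one above can be produced with a few Enumerable calls once the merged data is in hand. A sketch with made-up entries and an assumed field layout (the real Takeout/People API schema differs):

```ruby
require 'uri'

# Illustrative merged circle data; "urls" stands in for whatever field
# holds each person's linked profile URLs in the real data.
people = [
  { "name" => "A", "urls" => ["https://twitter.com/a", "https://github.com/a"] },
  { "name" => "B", "urls" => ["https://twitter.com/b"] },
  { "name" => "C", "urls" => ["https://flickr.com/c"] }
]

# Count profile providers by host, most popular first.
provider_counts = people
  .flat_map { |person| person["urls"] }
  .map { |url| URI(url).host }
  .tally
  .sort_by { |_host, count| -count }
```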

Plexodus-Tools continued & endless refactoring 

Somehow I just keep refactoring the code of because I'm not satisfied with the structure, rather than publishing it as is...
The code is functional, but I can tell that it evolved from a proof-of-concept prototype. I really should pick up or again, sit down, and think about the behaviour and structure first, before I start hacking away at it.
Did at least add a decent command line interface to it though.
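A minimal command line interface along those lines might be built on OptionParser; the flag names and defaults here are illustrative, not the actual interface:

```ruby
require 'optparse'

# Illustrative CLI sketch for a merge script; flags are made up.
def parse_options(argv)
  options = { output: "merged.json" }
  OptionParser.new do |opts|
    opts.banner = "Usage: merge_circles.rb [options]"
    opts.on("--takeout PATH", "Path to the Circles Takeout JSON") { |v| options[:takeout] = v }
    opts.on("--output PATH", "Where to write the merged JSON") { |v| options[:output] = v }
  end.parse!(argv)
  options
end
```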

Plexodus-Tools continued 

I doubt it's something the average user would know how to use though, as it needs:
· Fairly recent version of
· OAuth client_id.json file generated through Console
· CLI usage or code editing
But maybe I can put a simple Sinatra or Rails frontend with OAuth G+ Signin button in front of it.
Then again, do I want to handle user data? (W|Sh)ould users trust a random site with their Google+ Circles Takeout archive? I know I wouldn't...

That feeling when you kinda want to finish cleaning up your code so you can publish it, but you really should get some sleep instead...
Anyway, my tool for merging profile data from the Google+ People API into the Google+ Circles Takeout JSON files is as good as done.
Gotta write some documentation for it tomorrow, and perhaps do a couple more test runs, also from non-clean states, to check whether resuming works as expected.
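The merge itself is conceptually a keyed join. A sketch under assumed field names (the real Takeout and People API schemas differ):

```ruby
# Illustrative merge: fold People API profile data into Circles Takeout
# entries, matched by profile ID. Entries without a matching profile
# pass through unchanged, which is what makes resuming from a partial
# state harmless in this sketch.
def merge_profiles(circle_entries, profiles_by_id)
  circle_entries.map do |entry|
    profile = profiles_by_id[entry["id"]]
    profile ? entry.merge("profile" => profile) : entry
  end
end
```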