So, I mentioned recently that I wanted to migrate off of my shared hosting to a VPS on Digital Ocean.  One reason cited was more control over what I can do with the server.  It’s essentially just a cloud-based Linux machine; I can do anything with it that I would do on a locally hosted Ubuntu box.  I came across Tiny Tiny RSS recently, and it’s the perfect example of the kind of thing I wanted the VPS for.

While nowhere near the main reason, the final straw with my tolerance of Google’s increasing level of crap was the closing of Reader, a service I’d depended on pretty much since its inception.  I’d tried a few alternative solutions but nothing really did anything for me next to the simplicity of Google Reader.

Eventually I just sort of lost the desire for RSS feeds.  The whole web seems to be abandoning the idea (probably because it’s not nearly as easy to plaster crap ads all over an RSS feed), so I just decided to let it go.

Recently I’ve been trying to find a good solution again.  I really hate not being able to keep up with infrequently updated blogs I find.  That’s like 90% of the reason I liked having Google Reader: when that interesting niche blog I like that updates once every 4 months updates, I can know.

I looked into some Firefox extensions, but using them tends to be clunky.  I’ve tried a few different apps on my phone but nothing is ideal.  The biggest issue is a lack of sync across everything.


Tiny Tiny RSS is a self-hosted RSS reader.  You download it (with Git in this case), set up a database for it, and let it roll.  I’ve set it up on my little sandbox domain and added the feeds I was pulling with other services to it.
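For the curious, the setup is roughly this (a sketch, not a full guide: the clone URL, web root, and database credentials here are placeholders, so check the tt-rss site for the current repository address):

```shell
# Grab tt-rss into the web root (URL and paths are assumptions)
cd /var/www
git clone https://git.tt-rss.org/fox/tt-rss.git tt-rss

# Create a database for it (MySQL shown; PostgreSQL also works)
mysql -u root -p -e "CREATE DATABASE ttrss; \
  GRANT ALL ON ttrss.* TO 'ttrss'@'localhost' IDENTIFIED BY 'changeme';"

# Then visit http://yourdomain/tt-rss/install/ and walk through the installer.
```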

It’s web-based, so I can get to it from anywhere.  Need number one.

It’s hosted by me, so I won’t have to worry about some “thinks they know best” company screwing me over again.  Need number two.

There is a built-in API so it can be accessed via mobile with an app.  Need number three.  BONUS!  There is even a compatible Windows Phone app.

The next step is to figure out what I did with my old list of Google Reader feeds and start loading it up.


Josh Miller has been a hobbyist blogger and writer online since 1998 on a variety of topics mostly centered on Video Games, Toys, and Technology. Josh has also worked in the technical end of the television industry for many years.


Homemade Pizza

[Photo post: the pizza ingredients, the pre-cooked pizza, and the finished cooked pizza.]


Testing The Waters on Digital Ocean

A while back I set up a Digital Ocean VPS running OpenSIM.  I then promptly forgot about Digital Ocean.  Part of the problem was that I forgot my password, and for some lame reason the email didn’t show up when I searched my inbox for “Digital Ocean”.  It was also more of a fun side project that didn’t really cost anything, since DO gives you some credit when you sign up.

I’m thinking of using my VPS for a bit more, though; specifically, as my new web host.  I currently use GoDaddy, which works great and is affordable, but I’m starting to do a bit more experimenting and coding and I feel like I could benefit from something with a little more versatility.  For a few bucks more than I currently pay GoDaddy I can get a pretty decent VPS going and, at the very least, host all of the blogs I currently maintain on it (currently 4 with little to no traffic, and 3 with reasonably light traffic).  I can always beef up the VPS if the load ends up being too much.

I wanted to test out the migration and setup, though.  I was going to move Raid-Tier over, but it’s kind of in a state of limbo and I wanted something I use with some content behind it.  If I am going to discreetly move my wife’s blogs over interruption-free, I need to KNOW I can do it and KNOW it will work.  One, she’ll get pissy if it doesn’t, and two, she is getting a fair amount of traffic and I’d hate to interrupt that.

So I moved this blog over.  I also moved my little Sandbox Project over as well, mostly because it’s inconsequential if it gets lost somehow and I wanted a second domain on the hosting so I could make sure I’m doing the server configuration properly.

The migration wasn’t without issues.  For one, the SQL export from this blog is larger than the new server’s MySQL would accept through the usual import tools, so I had to do an old-fashioned WordPress export/import.  In my experience the WordPress export/import works great for small volumes of data but extremely poorly for large volumes.
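For what it’s worth, the usual workaround when a dump is too big for a web tool like phpMyAdmin is to feed it straight to the mysql client on the command line, which has no upload limit (database name, user, and file names below are placeholders):

```shell
# Import a large dump from the shell instead of a web interface
mysql -u wpuser -p wordpress < blog-dump.sql

# If the export is gzipped, pipe it rather than unpacking first
gunzip -c blog-dump.sql.gz | mysql -u wpuser -p wordpress
```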

I also had FTP issues.  All of the help files for vsftpd I could find were outdated (not uncommon when trying to solve Linux issues), and there is some newer “feature” I couldn’t figure out that seems to amount to “vsftpd won’t run if root has FTP access”.  I’m not sure that’s right because, like I said, I didn’t figure it out; I used SSH file transfer instead.  I needed to move all of the images from the old host to this host, all 3,000–4,000 of them.  It’s not a huge amount of data, but it is a LOT of files.
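A guess at the culprit: newer vsftpd versions refuse to start a session when a user is chrooted into a directory they can write to (the “refusing to run with writable root inside chroot()” error).  I can’t confirm that’s what was hit here, but if so, the relevant config fragment looks like this:

```shell
# /etc/vsftpd.conf -- relevant lines only; vsftpd 3.x
chroot_local_user=YES
# Relax the writable-chroot check (understand the security trade-off first)
allow_writeable_chroot=YES
```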

I still have permissions issues I have not figured out.  Permissions are probably the most annoying part of using Linux.  Yeah yeah, blah blah, security, I get that, but, fuck, there’s all this business of users and groups, who owns the files, who can write to or use the files, and what user and group each process runs as.  It’s kind of insane.  I even tried the whole “give everything full perms, chmod 777” method, with no success.
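Rather than chmod 777, the usual baseline for a WordPress directory is to make the web server user own the tree, with directories at 755 and files at 644.  The path and the www-data user below are assumptions (Debian/Ubuntu Apache defaults):

```shell
# Hand the docroot to the web server user, then normalize perms
sudo chown -R www-data:www-data /var/www/blog
sudo find /var/www/blog -type d -exec chmod 755 {} \;
sudo find /var/www/blog -type f -exec chmod 644 {} \;
```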

Which has left some broken internal links on this blog.  I’ll do some backend SQL work on it, but basically, I set up the old blog to run on /Year/Month/Day/PostTitle permalinks and the new one uses the ugly ?p=### style.  For some reason WordPress can’t change this setting in .htaccess, and it still doesn’t work when I manually create the .htaccess file.  So screw it, for now it’s ugly links all around.
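A likely cause, assuming Apache on Ubuntu: pretty permalinks silently fail if mod_rewrite is off or the site’s `<Directory>` block has `AllowOverride None`, in which case .htaccess is ignored no matter what’s in it.  The stock WordPress rewrite block is below; the docroot path is a placeholder:

```shell
sudo a2enmod rewrite          # enable the rewrite module
# Also set "AllowOverride All" in the site's <Directory> block
sudo service apache2 restart

# The standard WordPress rewrite block, if creating .htaccess by hand:
cat > /var/www/blog/.htaccess <<'EOF'
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
EOF
```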

Phase two will be to pull Lameazoid and Raid-Tier over.  If those go smoothly I’ll start with my wife’s less trafficked blog and see if she even notices, then work my way up from there.  The whole process should actually be seamless all around, since ultimately each domain will point to the same structure and data, just on a new server.

The point is to get a more versatile host and do some more complex projects without paying for TWO hosts.  I’d rather pay more for one host than putz with two hosts which end up costing more.  I also still plan to keep GoDaddy for my domains for now; I’m not unhappy with the service over there by any means, I’ve just outgrown it.



Why Can’t I Hold All These Devices?

There are 5 people in my household.  At this point, each of these people has at least a laptop and a tablet, and the majority of them have a handheld smartphone-style device (only one is actually a phone with data).  There are several game consoles and media devices, a couple of additional desktops, some security cameras, my Pi projects, etc.  A quick rundown gives me 33 devices, though there are more that are not frequently active.

At some point, it became necessary for me to take control and actively manage my home network.  I was getting double-assigned IPs from DHCP, I have shared drives that need static IPs, and I needed to implement security and filtering on the network to keep the kids from doing things they shouldn’t be doing online, as well as to track usage.

It also helps with security because I know what devices are online and if there are “outside” devices on the network.

So, a quick basic rundown of networking.  Every device, from PCs to Xboxes to iPods, gets a unique IP address; on home networks this will most commonly be 192.168.1.XXX.  This is how data is pushed around: data has a header that says “I need to go to 192.168.1.XXX, where is that?” and routers and switches push this data along until some device says “Here I am, send it to me”.  This is really, really generalized, but it’s the basic idea.

These IPs can be set up a few ways: static (always the same), configured on the device itself; assigned randomly from a pool by the router (DHCP); or assigned statically by the router based on the device’s MAC address.

Every device also has a MAC address.  MAC addresses are unique to the device interface; think of them as a fingerprint.  I say “device interface” because if a machine has multiple network interfaces, say WiFi and a wired Ethernet jack, these will have different MAC addresses despite being one device.  In most routers, you can set up a table of MAC addresses and tell the router “if you see this MAC, assign it IP X”.

This is really useful for things like laptops, phones, and tablets.  For a machine like a desktop PC or a server that never leaves the network, it may be better to assign the IP on the device, that is, the device connects and says “I want IP YYY”.  If a wireless device is configured with a static IP on the device itself, it can cause trouble when that device travels out of network; the static IP may not be available at, say, your friend’s house or a coffee shop.  The remote location may use a different IP scheme, their router may sit at a different IP, or another device may already be using that particular IP.
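Most consumer routers expose MAC-based reservations in their web UI, but on a Linux router running dnsmasq (an assumption for illustration; the MAC and IP below are made up) the same idea is two lines of config:

```shell
# /etc/dnsmasq.conf -- hand out .100-.200 to unknowns, pin known MACs
dhcp-range=192.168.1.100,192.168.1.200,12h
dhcp-host=a0:b1:c2:d3:e4:f5,192.168.1.42   # this laptop always gets .42
```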

So why assign IPs?

File servers really need static IPs.  If other devices are connecting to another machine to get say, photos, that other machine, the server, needs to always be in the same place.  Imagine how hard it would be to go to your friend’s home if their home was always in a different location and every building looked identical.

This also avoids IP conflicts.  This is less common since the Router is supposed to not double assign IPs but occasionally if a device disconnects and reconnects while another device is reconnecting, the IP may accidentally become double assigned, which means those data packets go nowhere.  This would be like trying to go to your friend’s address but there are two homes with the same address across the street from each other.

Assigning IPs is also great for security.  Limiting the IP range of DHCP, or limiting the number of devices that can connect, keeps the network from getting overloaded by random people, though with a WiFi password this shouldn’t happen anyway.  You could also limit the capabilities of IPs assigned through DHCP.  With scanning software you can also know at any time what is connected to your network, which helps diagnose issues: if you’re unable to watch Netflix on your Wii, you can run a scan, and if everything shows up but the Wii, you know the issue is probably the Wii itself.

It’s also been good for my own experience in learning methods for managing small networks and configuring the router.  I started off with a list of devices in a spreadsheet.  I then gathered all the MAC addresses through a combination of scanning the network and checking the devices themselves.  Most devices will show you the MAC address in the settings somewhere, and if all else fails it’s often printed on a sticker on the back.
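Two common ways to do the scanning side of that from a Linux box on the network (the subnet is an assumption; nmap needs root to report MAC addresses):

```shell
# Ping-scan the subnet; with sudo, each live host is listed with its MAC
sudo nmap -sn 192.168.1.0/24

# Or just dump the ARP cache of whatever this machine has talked to recently
ip neigh
```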

I then sorted out blocks of IPs based on device type and slotted everything into these blocks.  This helps keep things organized.  The only thing that changes is the 4th octet of the IP, so everything is 192.168.1.XXX.  From there I use the following schema:

  • 01-09 = System devices: the router, the WiFi access point, my NAS
  • 10-20 = Game consoles and media devices
  • 21-29 = Desktops, of which there are 4
  • 30-39 = Handheld devices belonging to the kids
  • 40-49 = Laptops
  • 50-59 = My devices
  • 60-69 = Reserved for IP cameras
  • 70-79 = Reserved for Raspberry Pis and other Internet of Things style devices
  • 100+ = DHCP assignments

This can be modified based on personal needs of course.  The idea is essentially that if nothing shows up under the 100+ IP range, I can know at a glance that nothing unknown is attached to the network.
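The schema above can even be turned into a tiny lookup for scan scripts; this is just an illustrative sketch (the function name is made up), mapping the fourth octet to its block:

```shell
# Map an IP's fourth octet to its block in the schema above
ip_block() {
  last=${1##*.}   # strip everything up to the final dot
  if   [ "$last" -ge 100 ]; then echo "DHCP pool (unknown device?)"
  elif [ "$last" -ge 70 ];  then echo "Pi / IoT"
  elif [ "$last" -ge 60 ];  then echo "IP camera"
  elif [ "$last" -ge 50 ];  then echo "My devices"
  elif [ "$last" -ge 40 ];  then echo "Laptop"
  elif [ "$last" -ge 30 ];  then echo "Kids' handhelds"
  elif [ "$last" -ge 21 ];  then echo "Desktop"
  elif [ "$last" -ge 10 ];  then echo "Console / media"
  else                           echo "System device"
  fi
}

ip_block 192.168.1.55    # -> My devices
ip_block 192.168.1.123   # -> DHCP pool (unknown device?)
```

Feed it the IPs from a network scan and anything landing in the DHCP pool is worth a second look.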



Ubuntu Again

I’ve been chugging away on the same laptop for many years now.  That old HP Mini I’ve been using for years?  I’m still using that, in all its netbook and Atom processor glory.  The screen is a little flaky at the wrong angles, but it gets the job done.  I’ve been sharing it with my wife for a while now, but recently (over the holidays) we bought her her own laptop.  Now that she’s blogging, it’s inconvenient for both of us to share a laptop since, inevitably, both of us want to use it at any given time.

It’s also becoming increasingly unstable, more so when she is using it.  Part of the issue is that it has been running Windows XP since forever.  I have dual-booted many flavors of Linux on it over the years and even ran the Windows 8 Beta on it for a bit.  Unfortunately XP is completely and utterly end-of-life from Microsoft, and more and more it shows.  Browser compatibility keeps shrinking and it’s just not as capable as it was in the past.  I’m not about to shell out for a new updated version of Windows, though.  Since I no longer have to deal with the learning curve of teaching her how to use Linux, I am now free to go back to Linux on my laptop; specifically, Ubuntu, and specifically, ONLY Ubuntu.  No dual booting or any of that nonsense.

I already had Lubuntu installed on a spare drive that I swapped into the machine, but I had issues getting networking to work in Lubuntu.  I probably could have fixed them, but I opted to just blow it out for a fresh 14.04 Ubuntu install.  Unfortunately and irritatingly, the WiFi issue persisted.  The core issue is that the Broadcom driver needed is “proprietary”, i.e. not open source (though it is free), so it’s not installed or included by default.  This problem is compounded because the age of this device means all of the tutorials are outdated and suggest I install the “jaunty backports repositories” and restricted drivers or something using Synaptic.
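On 14.04 the usual route for this (an assumption, since it depends on the exact Broadcom chip) is the packaged proprietary wl driver, which needs a temporary wired or USB-tethered connection since the WiFi itself is down:

```shell
# Install Broadcom's proprietary "wl" driver from the restricted pool
sudo apt-get update
sudo apt-get install bcmwl-kernel-source
sudo modprobe wl   # load it without rebooting
```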

The specifics are not important; what’s important is that Jaunty Jackalope was like 4 distros or so ago and Synaptic is no longer the default package manager.  I got it working, but it’s always been an issue.  At least this round the system defaults to “disable touchpad while typing”, so my cursor isn’t flying all over the place.

So anyway, new year, new… ish… OS.  Not much else will probably change on my whole projects-and-workflow end.  I can do most everything I need to do besides play games with Ubuntu, and the NAS means I can get to my files regardless of OS.  The only real issue is rejiggering my blog workflow, but lately I haven’t been posting shit anywhere anyway, so it really doesn’t matter much.  At the moment I’m writing this with Pico (because vi is a piece of crap), but I am not real sure there is any way to push this into production without just cutting and pasting.  Also, word wrapping is nonexistent, which makes it tricky, what with hard line returns and junk that will probably cut and paste like garbage.



Synology Phase 03 – The Apps

The key component of the Synology is the software.  You can buy cheaper NAS devices if you just want a network storage device.  Honestly, the price justification is almost entirely in the software, though the hardware RAID (as opposed to software RAID) is a partial factor as well.  The box itself isn’t all that sophisticated or exciting, honestly.

I don’t plan to cover every available app by any means; this is just a rundown of some of the apps I find useful, and probably the ones that are most commonly used.

Photo Station

The main reason I even bothered investing in a NAS at all was my photo collection.  Everything else is a bonus.  I’ve got 250GB of family photos I’ve taken over the years, plus a whole mess of other photos from blog posts, etc.  I’ve gone through a drive crash on them before, and the drives I’ve been using are aging rapidly.  The wife doesn’t care to lose the photos either.

Photo Station is, of course, made for photos, and it’s pretty great.  The interface is very similar to OneDrive, which is probably my favorite interface of all the photo services I’ve used.  Everything flows together into a wall of photos, really great for easy navigation.  There are also tagging functions, which I plan to use later once everything is loaded.

Audio Station

I’ve tried several solutions for streaming my audio collection to my phone.  I had sort of mostly settled on Google Play Music since it’s the only service that would let me upload my entire collection of some 20,000 songs at once.  I don’t really NEED all these songs at once but I do like having the option.  I buy most music these days via Amazon and so streaming via Amazon is always an alternative as well. 

Unfortunately, my recent conversion to Windows Phone from Android means neither service is an option.  I’ve been back to putting music on my device like a caveman.  Fortunately, there is a DSMusic app available for Windows Phone, and I don’t have to worry about any limitations of any service, it’s just all there.  Now I just need to build some good playlists.

A secondary benefit here: Google Play Music was one of the few Google services I still sometimes used.  I’ve worked pretty hard to divorce myself from all things Google for a variety of reasons, primarily privacy concerns, and secondarily because they are starting to push their own semi-proprietary services on the web over long-standing, more open ones.  Basically, they are using their considerable size to bully everyone into their methods.  “Don’t Be Evil” doesn’t seem to be a thing anymore.  But anyway, Google isn’t the topic here; having a good Google Music alternative that works on WP is.

Download Station

I don’t use torrents too often, mostly for my Humble Bundle downloads, but the Synology has a really nice built-in torrent client.  I don’t have to worry about keeping a program running elsewhere or using drive space on my PC; it just downloads them right to the NAS.  There are even some auto-extraction settings, though I have not looked into those yet.

Web Station

A nice little bonus I wasn’t quite aware existed before buying this NAS: it can function as a web server.  I’ve long given up hosting my own websites from my house, but I do keep some WordPress and other files on an internal web server for archival purposes.  Web Station has allowed me to move these archives off my Ubuntu server to the NAS, which also means they get the backup functionality of the NAS itself.

Note Station

Another unexpected surprise, though I have not explored it completely yet, the Synology includes an app called Note Station, which can sync (or at least download) from Evernote.  I’ve been racking my brain for a while on a good way to backup my Evernote notes, with ideas ranging from Print to PDF using some script to just pulling it weekly to a PC client.  Problem solved.

Cloud Sync

Another nice backup feature: the Synology can hook into and sync with both Dropbox and OneDrive.  I use OneDrive for some backups and Dropbox for some phone syncing, so pulling both to local internal storage is a plus.  I may even look into using OneDrive as a secondary backup like I had originally planned.  Office 365 now includes unlimited storage on OneDrive, in addition to client licenses for 5 copies of Office (there are conveniently 5 people in my family, all with PCs).  It’s a really tempting offer, and with it I could set up the Synology to start pushing all (or select) data to the cloud for an offsite backup.

Surveillance Station

Last, and the ONLY feature I have been disappointed with so far, is Surveillance Station.  I mentioned recently setting up cameras for monitoring and security.  I currently have three cameras and may install a few more.  The Synology only allows the use of 2 cameras before needing to purchase additional licenses.  I’m not super irritated about the additional cost; I get the whole “it supports dev costs” thing.  My problem is that licenses are $60+, EACH.  If I wanted to add another 3 cameras like I am considering, I’d need 4 more licenses, or $240.  I’d be alright with maybe $10–$15 per seat, or even $60 for “unlimited” (within the capacity of the device), but $60 each is a little ridiculous.

I suspect there are some lame-ass license fees Synology has to pay to someone involved, but that is also kind of giving them the benefit of the doubt.

I’m still super satisfied with the box, but better/cheaper access to Surveillance Station seats would let me eliminate my Ubuntu server completely.

