Category Archives: IT

Apache mod expires and Puppet Syntax

Puppet is an awesome tool, but it can get pretty esoteric, as its modules stack configuration option after configuration option. When you hit the seemingly little-used parts, the normal documentation gets thin. mod_expires for Apache took me a while to figure out; this post will save you some time.

puppet apache mod expires example
Woo! It works! Watch that jpeg MIME type though… see a later working revision below!

What are we trying to accomplish, anyway?

Continue reading

ROKU and Credit Cards

Last fall I was able to get in on this great ROKU Ultra deal: $50 for a $100 streamer! I was replacing an Amazon Stick for my in-laws as a Christmas gift, and I was high on it for two reasons. The first is that it was one of the physically larger units, which let it house higher-end hardware with, ideally, fewer stability issues than the Amazon stick. The second perk was the “premium” remote, which had all the buttons plus the killer feature – the headphone jack! Hearing and volume issues being what they were, this seemed like a great way for one of them to watch TV without the audio blasting. There were a couple of bad surprises when I opened the box, however…

Continue reading

Google Chromecast Broken in 2019

I wanted to like this thing….

Streaming Internet video is taking over the way many people consume their television and movies, and there are more and more ways to accomplish this. The Google Chromecast seems to be a much-loved tool for the job and can be found for $30 or less nearly anywhere. I had gotten one expressly to screen-share from Android devices a while back, and it had sat in a drawer, lonely and unused. Then, just before cancelling the Chromecast Audio (which I continue to love), Google announced that it was now possible to add a Chromecast (video) to a speaker group. Awesome! Except… not.

Continue reading

Working With Puppet

Moving on to some new challenges, I had to face the reality that some of my skills and operating procedures were dated. In this brave new era of Cloud-everything – which is really a simple way of saying software-defined – what’s really nice is how disposable everything is. You can get a new server instance very quickly, and if you “recycle” an existing one, DNS propagation is a non-issue and you can literally refresh a server in minutes. Many times you could redeploy a fresh server and your application faster than you could troubleshoot it.

That said, this automagic configuration is not free.

Continue reading

Percona apt repo with Key – Ubuntu

puppet code snippet

It works!

Using Puppet for configuration management is great. So is using the high-performance Percona DB. Ditto for Ubuntu 18.04 LTS Bionic. The issue arises when you combine them and realize that all of the modules and easily located online resources use the old, deprecated short key; you can read more here:

https://www.percona.com/blog/2016/10/13/new-signing-key-for-percona-debian-and-ubuntu-packages/

You can read in the comments that some people asked for the current key… but didn’t get it. I found the installable .deb file that Percona provides to be a PITA to use with the PuppetLabs Apt module:

https://www.percona.com/doc/percona-server/LATEST/installation/apt_repo.html

I jumped through some hoops (I should probably have documented that…) with gpg commands and the deb package and determined that the repo’s current public key is:

4D1BB29D63D98E422B2113B19334A25F8507EFA5

Want to put this to work in your own Puppet Module?  Here you go:

apt::source { 'percona':
  ensure   => present,
  include  => { src => true },
  location => 'http://repo.percona.com/apt',
  release  => $::lsbdistcodename,
  repos    => 'main',
  key      => '4D1BB29D63D98E422B2113B19334A25F8507EFA5',
}
-> Class['apt::update']

This should work fine with other flavors of Ubuntu like 16.04, and you can also use it to get a repo installed for xtrabackup (make sure you use apt to get xtrabackup24; otherwise you’ll get the very aged version from the Ubuntu sources).
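For readers not on Puppet, here is a rough manual sketch of what the resource above ends up doing. The keyserver and the exact package name are my assumptions, not from Percona’s docs, so adjust as needed:

```shell
# Full-length key fingerprint from the post above.
KEY='4D1BB29D63D98E422B2113B19334A25F8507EFA5'
CODENAME='bionic'   # normally: $(lsb_release -sc)

# The repo entry apt::source generates; the real file lives in
# /etc/apt/sources.list.d/percona.list and needs root to write.
echo "deb http://repo.percona.com/apt ${CODENAME} main" > percona.list
cat percona.list

# On a real host you would then import the key and refresh:
#   apt-key adv --keyserver keyserver.ubuntu.com --recv-keys "$KEY"
#   apt-get update && apt-get install percona-xtrabackup-24
```

The Puppet module handles the key import and the `apt-get update` ordering for you, which is exactly why the snippet chains into `Class['apt::update']`.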

VHS to DVD Conversion

A while back, maybe a few years back, Kristin said that I should convert our tapes to DVDs and get rid of them.

She may have said this a few times.

Then it made it onto my official “to-do” list, and I am pleased to say that I am now taking this seriously and have acquired the means to do so.

At one point I thought I would just use Windows Media Center and record from the TV setting.  This seemed simple and genius – I have TV tuners to spare.

It also doesn’t work.  Sigh.  Some technical reasons that I don’t really get prevent this from happening.

From there I was stuck, but the Internet is full of information.   I found this article and promptly picked the worst, cheapest converter on the list.  Reviews at both Amazon and NewEgg convinced me that with a little perseverance I could make this work.  I’ve modified drivers, spun up Linux VMs to repackage installers, etc. so I hoped that I could make it work.

Turns out I didn’t need to hope.  This Kworld USB converter works fine with Windows 7.

The important bits were covered by this NewEgg review:

Here’s how it works. You plug the USB into your computer. You plug the audio cable into your computer input as well. You install the driver from the driver disk, then you install the two programs from the program disk which are both included in the box. You then hook your VCR up to the RCA or S-Video jack. You attach your audio cable from the VCR to the RCA jacks. If you only have one audio cable, use the left channel. Open the Power Director program and use the capture tab on the upper left side of the screen. It should then look for the signal, and then show you what your VCR is doing on the little screen. To record, DO NOT use the red button on the program. I keep getting copyright protected errors. Instead, press the button on the kworld device (it’s oval and should have a green light lit next to it). It will start recording the video. Make sure you have it going into the file you want and in the proper format. It works, but could use much better instructions.

One brief stumbling block was that I wasn’t getting any video. Messing with the VCR, it appears that the front outputs are no longer working – or are they secretly inputs? No idea, but plugging the cables into the back of the VCR brought up the image fine. Issue number two was that there wasn’t any audio during the capture, so the only way I could judge audio sync was by stopping the recording and watching the preview. Not good. Googling led me to this forum entry and this answer:

You can always monitor the output audio of your VCR (headphone) or video camera speaker while playing back the video to be captured, or also use the windows recording mixer to monitor, so you can know if your material have audio or your audio channel is having any problem. But if you can find out how to monitor the audio level during capturing on PD9, please let me know.

It was really that easy. I opened up the “Recording Devices” control panel by right-clicking on the little speaker icon by the clock. Then I checked this box:

ListenLineIn

Check this box!

Now I can hear the capture audio while it is in flight over my PC speakers and have a good idea of the incoming volume.

Next up – re-encoding the files in Handbrake for portability.

–Nat

 

Now with better performance…

We’ve been trying to make Magento perform better, and one of the simplest things to do is have your PHP code cached. APC is a package that does this…

Since the Ubuntu VM is an ancient distribution (8.04 LTS), I couldn’t do it the super-easy Zend Framework way, but this blog got me through.

http://www.mcdruid.co.uk/content/installing-apc-on-ubuntu-linux-and-benchmarking-drupal-6-performance-improvement

Down in the comments there is a helpful post about using wget and compiling it – worked like a charm! Then just move the apc.php file into your wordpress directory and bam, off you go!

Probably should secure that somehow…

–Nat

Crashing Crashplan

I’ve recently started using Crashplan to back up a rather large file server. It was crashing repeatedly at around ~1.1TB and ~300k files.

The error message we were seeing on our remote host was “target lost,” which led us to many hours of troubleshooting disk performance and network connectivity. After attaching a “local” disk to the VM for local backups and waiting the ~12-14 hours for the initial backup to get to the same spot – and then fail – it appeared the problem was something more systemic.

Contacting Crashplan support yielded this very helpful response:

Crashplan Rep Response:
It appears that the CrashPlan backup engine is running out of memory.

Running Notepad or any other text-editor as an Admin, edit the CrashPlan engine’s CrashPlanService.ini file to allow it to use more java memory:

1. Stop the backup engine: http://support.crashplan.com/doku.php/how_to/stop_and_start_engine

2. Locate the Notepad program, right-click and Launch as Administrator

3. Go to File > Open, and navigate to C:\Program Files\CrashPlan\CrashPlanService.ini

4. Find the following line in the file:

-Xmx512m

5. Edit to something larger such as 640, 768, 896, or 1024. E.g.:

-Xmx1024m

This sets the maximum amount of memory that CrashPlan can use. CrashPlan will not use that much until it needs it. I would recommend starting out setting it to 768, and go higher only if you continue experiencing problems. You can set it as high as 2048 on 32-bit systems, or even higher on 64-bit systems.

6. Start the backup engine.

Outcome:
We set it to -Xmx1024m and also increased the VM’s memory allocation by 1GB. The server is running like a top, and backups are consistently completing successfully.
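The edit from steps 4-5 is a one-liner if you’d rather script it than use Notepad. A sketch using a stand-in file, since the real one lives at C:\Program Files\CrashPlan\CrashPlanService.ini and needs admin rights:

```shell
# Stand-in for the shipped CrashPlanService.ini heap setting.
printf -- '-Xmx512m\n' > CrashPlanService.ini

# Bump the JVM heap cap from 512 MB to 1024 MB, per the support reply.
sed -i 's/^-Xmx[0-9]\+m$/-Xmx1024m/' CrashPlanService.ini
cat CrashPlanService.ini
```

Remember to stop the backup engine before the edit and start it again after, as in steps 1 and 6.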

Troubleshooting backups, especially multi-TB datasets, can be a huge pain, as they take so long to redo and reproduce. Props to Crashplan for getting back to me within two hours on our free trial, which has since been converted to their family unlimited plan for two years. *thumbs up*

–Nat

Editing DNS in Ubuntu 12.04 Server

http://askubuntu.com/questions/130452/how-do-i-add-a-dns-server-via-resolv-conf

That link really helped me out.  Essentially you just follow through this example:

Edit the /etc/network/interfaces file. The same configurations that you would have written to resolv.conf can now be in the same file as your network adapter configurations like the example below:

# The loopback network interface
auto lo
iface lo inet loopback

# The primary network interface
auto eth0
iface eth0 inet static
    address 192.168.1.2
    netmask 255.255.255.0
    network 192.168.0.0
    broadcast 192.168.1.255
    gateway 192.168.1.1
    dns-nameservers 75.75.75.75 75.75.76.76
    dns-search local
    dns-domain local.domain

I found this sample very handy in setting up a 12.04 Server instance. So many posts are about Network Manager, but that isn’t available without installing many other packages, etc.

–Nat

Steam & Your Small SSD

I just finished “upgrading” my main PC for the first time in almost two years, and this is the first mainboard and CPU upgrade in nearly four. Those of you familiar with my PC upgrade habits know that is like having an Ice Age occur, the glaciers coming and retreating, and the Earth turning green again between upgrades. In truth, this is a fairly minor upgrade in that I bought nothing new, save a $30 case, in order to pull it off.

Well, in truth I did buy a shiny, new 180GB Intel solid state drive. Yes, it is SandForce based, which I vowed never to buy… but it is also Intel, which I always promise to buy but then shrink back from the cost… *shrug* I had completely outgrown the 40GB Gen2 Intel SSD, however, so this purchase was completely spousal-approved. I am typing from the very machine I put together, fancy water cooling kit and all. It’s a little louder than I would prefer, but the big upgrade comes a year from now and I’ll save the money and trouble until that time…

ANYWAY – the main event. I put Steam and all of my Steam games on a Western Digital Black 640GB 7200 RPM drive. It is plenty speedy for game load times, but Steam annoyingly always took a while to launch, and the UI was painfully laggy compared to the apps installed on the SSD boot drive. A bigger SSD makes this all better, right? Well, at any given time I have over 200GB of Steam games installed, not to mention the ~30GB of Blizzard games sitting on the hard drive. There wasn’t room in the budget for a 512GB or 600GB SSD (I paid $130 for 180GB; a 512GB is at least $350, if not $400-500), especially given the minimal gains games see from being installed on an SSD.

Simple solution, right?  Install Steam to the SSD, install the games to the spinning cheap drive, call it a day!

If only it were that easy!  Steam installs all of the games you manage through it in the same directory that you install the Steam application into.  Remember, I have only a 180GB drive and 200+GB of games installed.

Enter “symbolic links” and easy apps like http://www.traynier.com/software/steammover/ that let you move installed games to a secondary drive using a clever trick of NTFS. This means you can have your cake and eat it too: a minority of your games live on the SSD along with the core Steam files, giving the best possible performance, while the games you aren’t actively playing, or that are too big, are more economically stored on spinning disk. I started using this tool, which meant I copied my downloaded games from the steamapps/common directory of my old Steam install (the same secondary disk was in my old system and carried over to the new one) to the SSD, and then the handy tool moved the data back to the secondary directory on the spinning drive.

This took a while, even at ~100+MB a second. Plus, writing data to an SSD wears it out, so it should be minimized where possible. I got to Rage, the newest game from id that is ~21GB on disk, and decided there had to be a better way.

I found a great workaround. Now I create the same directory in the SSD’s steamapps/common directory and copy over the .exe and other miscellaneous top-level files from the spinning drive – about 30-50MB depending on the game, which takes less than a second. Next, I use the tool to “move” the game from the SSD to the secondary drive. Finally, I cut the massive files out of the original steamapps/common/game directory and paste them into the new directory on the same drive and partition. Since this is a simple modification of the file system tree and no data gets moved, it is essentially instantaneous.
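The core trick, stripped of the Steam Mover tooling, is just “move the payload, leave a link at the old path.” A POSIX sketch of the idea (on Windows the tool uses NTFS junctions, akin to `mklink /J`; the directory layout and game name here are made up for illustration):

```shell
# Lay out a fake SSD Steam library and a fake game payload.
mkdir -p ssd/steamapps/common/Rage hdd-steamlib
echo 'big payload' > ssd/steamapps/common/Rage/rage.dat

# Move the payload to the cheap spinning disk, then leave a link behind
# so Steam still finds the game at its usual path.
mv ssd/steamapps/common/Rage hdd-steamlib/Rage
ln -s "$(pwd)/hdd-steamlib/Rage" ssd/steamapps/common/Rage

cat ssd/steamapps/common/Rage/rage.dat   # still readable through the link
```

Because only the directory entry moves, Steam keeps working against its original path while the bytes live wherever you like.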

Win for me!  Hopefully a win for you!

Note – the Valve games put their darn big files right in the root of the steamapps directory, so this relocation trick doesn’t even work on them. If you are planning on playing TF2, L4D, CS:S, etc., you are going to need a decent amount of room on the SSD to pull this off. I wouldn’t do it with less than an 80GB SSD.

–Nat

(also, not shut out in July! :))