Thursday, November 26, 2015

Cracked!

Last week I discovered some open outgoing SSH sockets on my "hardened" Red Hat gateway machine.  A little more digging revealed a web server connected to those sockets, but I haven't yet been able to find the files being served.

I'll admit I had become complacent about updating the system, since it had been working with perfect performance.  Being lulled into a false sense of security is no excuse!

I haven't yet had the time to rebuild a hardened server from scratch (it is a ton of work), but the first two quick fixes were to disable httpd and to restrict SSH to my internal NIC by adding a ListenAddress entry to /etc/ssh/sshd_config (the daemon's configuration file; ssh_config only affects the client).
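In case it helps, here's roughly what that entry looks like (a sketch; the 192.168.1.1 address is a stand-in for your internal NIC's address):

# /etc/ssh/sshd_config
# Listen only on the internal interface, never the WAN side:
ListenAddress 192.168.1.1

Don't forget to restart sshd (service sshd restart) for the change to take effect.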

However, I'm not much of a sysop or IT specialist, and I lack any real system hardening knowledge.  Most of what I did was to dumbly follow various "best security practices" advice from distro makers, government agencies, and security companies.

Clearly the crackers are way better at their game than I am at mine.

So I decided to look at what's available in security-focused routers and immediately stumbled upon the Turris Omnia Indiegogo campaign.  For US$209 (shipping included) I'll get a powerful OpenWRT-based gigabit router and a/b/g/n/ac MIMO access point that includes a monitored honeypot and automatic security updates from a highly-rated Czech provider.

Since the router won't ship until next spring I'll still have to (re)harden my Red Hat box, but after that I'll let the professionals provide my first line of defense.

Thursday, October 1, 2015

The Sound of Silence

My Windows 7 home theater PC (HTPC) video and sound outputs both connect to my Sony receiver over HDMI, with the video routed by the receiver to the TV. Both the sound and video quality are excellent.

But there's a problem when switching between programs sourcing sound on the PC: The receiver makes a very loud pop/boom when the first sound is played after startup.  From then on, the first couple of seconds of program sound (such as a YouTube video) are missed, though the sound then continues without the pop/boom.

It took me a while to troubleshoot this phenomenon. It turns out that Windows completely stops outputting digital sound when there is no active sound source. That is to say, it does not output "silence" (a continuous sound stream containing data with a digital value of zero), but instead halts the entire Windows sound subsystem. So, when a program running on the HTPC starts to output sound, the Windows sound system has to be reinitialized, which can take a couple seconds.

At first, I suspected this was a Windows sound configuration issue. I did several searches and found no solution that involved changing system settings. I next looked for utilities or other apps that addressed the issue and found nothing.  Clearly, relatively few people run their sound through a receiver.

Under Linux, the fix would be simplicity itself: Run the sox program and tell it to output zeros.  Since I had Cygwin installed, it took only a moment to find that sox was indeed available under Cygwin, so I installed it, opened a terminal window, and entered the following command:

sox -n -d -q &

Ta-daa!  Problem fixed.  But I'd prefer the solution to be automatic, not manual.

The next step is to get sox to start when the HTPC boots, preferably so it will automatically restart sox if it ever exits.  Under Linux, this kind of thing is handled using cron or one of its successors.  I could install Cygwin cron, but is there a direct way to do this under Windows?

Sure!  Use the Windows 7 Task Scheduler.  You can follow this recipe combined with this one, or one of the many other similar recipes available.  Just be sure to run the task with Administrator privileges, which sox requires to access the sound hardware.
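If you'd rather skip the GUI recipes, the schtasks command line can create an equivalent task.  Here's a sketch; the task name and Cygwin install path are my assumptions, so adjust them to match your system:

schtasks /Create /TN "SilenceKeeper" /TR "C:\cygwin\bin\sox.exe -n -d -q" /SC ONLOGON /RL HIGHEST

The /RL HIGHEST switch runs the task elevated, satisfying the Administrator requirement mentioned above.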

Tuesday, September 1, 2015

Breadboard Vise

On 14 August 2015, Hackaday mentioned the 3D printed breadboard vise described in this blog post by Pat Regan.  If you haven't already, please read Pat's post.  Everything that follows assumes you've memorized that post, including the embedded video.

For me, this was a revelation.  I had previously been using plywood, double-sided tape, duct tape and modeling clay to hold my breadboard in place and keep everything else in close proximity.  A real mess that often resulted in jumpers getting dislodged or completely lost.

So I wrote Pat and asked him to sell me a Breadboard Vise, and to my surprise he said "OK"!  He offered me the one pictured in his blog post and video, but also said that if I didn't mind waiting a bit, there were a few tweaks he'd like to make.  Of course I chose to wait!

Well, my vise arrived today (31 August 2015), and I immediately filled it with the main components of a two-part project I've been working on:

My gear in Pat Regan's vise

That's an ESP-12 on top, and a RasPi2 with PiCam below, with a Pi breadboard breakout in the center.  That PiCam had been flipping around and getting in the way.  Not any more!  Needless to say, the vise works exactly like it does in Pat's video.

A close look at the image will show that the RasPi2's power, HDMI and audio connectors are blocked by the vise jaw.  I tried rotating the Pi 90 degrees (I don't need the Ethernet or lower USB connectors), but the vise jaws won't extend quite that far.  If your board's narrowest dimension is greater than about 2.5", it won't fit.

There are two workarounds: a hard one and an easy one.  The hard one is to design a jaw that has cutouts for the RasPi connectors, while still being strong enough to hold the board securely.  The easy one is to mount the RasPi to an adapter plate (nylon standoffs and a hunk of perf board), and put that in the vise.  Or maybe 3D print a RasPi carrier that could also serve as the base of a RasPi enclosure.

Opening Pat's files in OpenSCAD shows relatively few lines of code are needed to describe each part (the vise jaw and the vise base).  Though this was my first time using OpenSCAD, I did notice some key design items:
  1. A fudge-factor in the code shows it took some experimentation to find the optimal clearance for the arms of the vise jaw to slide easily into the base while not being loose.
  2. The bottoms of the vise jaws do not reach down to the bottom of the base.  This ensures the jaws can be easily opened even if the base is fastened down.
  3. The bottom of the base is solid.  While this increases filament use and print time, it ensures the breadboard is securely held, and that the vise arms won't snag from below.
As Pat mentioned in his post, he went to great lengths to have a single jaw design that would work in all four locations in the base while still being strong and completely functional.  After the idea itself, this is my favorite aspect of the design.

I'm still not sure why the vise arms don't snag or jam when moving in and out of the base, even when slightly cocked.  The clearances are part of it, but it must also have something to do with how the part was printed, or perhaps how FDM printing itself works.  Anyhow, I like it!  This vise is effortless to use.

And in case you were wondering if you should trust your breadboard and other boards to a 3D printed vise held together with rubber bands, here's the money shot, taken with the board held upside-down at waist level:


You'll notice I'm not using the breadboard retention rubber band Pat uses in his video: The version of the base he sent me (which may not be the one on Thingiverse or GitHub) retains the breadboard just fine without it, though the slot will probably remain for another iteration or so.  The screw holes are also visible, which were likewise added after the original design.

I was going to mention some minor printer issues that are visible in the vise Pat sent, but there's no need to, since they don't affect vise operation in the least.  That's another credit to Pat's design, which to me means it should be printable by just about anyone on just about any printer, perhaps using just about any rigid filament.

Want one of your own?  If you lack a 3D printer, Pat will be selling various versions of his vise on Tindie. I'll update this post when his store goes live.

Me, I think Pat should consider going straight to Kickstarter and getting injection molds made.  I absolutely believe all Makers will soon think it is silly to have a prototyping breadboard without a board vise.

The definition of a hobby like 3D printing is much like the definition of a boat: A hole into which you throw money.  Once in a while you see someone make something that suggests maybe his hobby should return some of that money.  I really hope this project enables some great things for Pat!

That's really all I have to say about Pat Regan's Breadboard Vise.  It is a sensational solution to a persistent problem.

But there's more I'd like to say about Pat.  We averaged an email a day during the two weeks between the Hackaday post and when my vise arrived.  He has been a very generous correspondent, taking me through his process, his equipment, and his own experiences with 3D printing and the 3D printing community.

Prior to seeing this vise, I had no intention of getting a 3D printer.  Not any more!  I think I just needed to see a project I absolutely needed, one that made ideal use of a 3D printer.  Having seen Pat's design and started to play with OpenSCAD and slic3r, I find my prior excuse of lacking mechanical engineering skills no longer applies: Pat has shown me it's mainly geometry and math, and the willingness to make some bad parts along the way.

Thursday, August 20, 2015

On The Other Hand...

In my prior post I implied it was OK to start building the Gate Monitor using an ESP-01 and 3 AA batteries.  I was wrong.

I had neglected to include the difficulty of disabling all the handshakes and timeouts inherent with WiFi and TCP.  While it is certainly possible, it would take lots of work to figure out, implement, and debug.  In fact, I found no evidence that anyone has ever done it!  To me, this means the project could be much more complex than just wiring up a switch and batteries, then writing a few lines of code.

A possible halfway position would be to place the AP in RTS-CTS mode and set the AP DTIM value to 3, then deep sleep between DTIM intervals.  While this would work well for the node, it would wreak havoc with other WiFi users, possibly reducing overall throughput by up to 80%.

This would be practical only if the ESP-01 nodes were all on their own private WiFi network, separate from all other traffic.  While cheap $20 b/g/n APs are available, it would still be overkill if all we want is a single simple gate monitor.

So, let's say the ESP-01 is most easily used when its WiFi is operating normally, which means an average consumption of ~70 ma even with minimal transmission.  Which means disposable batteries just became a major recurring cost.

There is an alternative: Piggyback the ESP-01 on a much larger battery-powered host.  In my case, I have some PIR-activated 60-LED solar security flood lights.  Adding the ESP-01 should not significantly affect flood light activation.  And I've been thinking about improving the lighting at the gate, so moving a solar flood there makes sense.

We'd need to add a $5 switching regulator for the ESP-01, since the LEDs are connected directly to the battery.  The PIR sensor is powered separately, but its supply couldn't also power the ESP-01.

This would have the added benefit that the PIR sensor could also be monitored, permitting a message to be sent whenever someone approached the gate.  And we could also monitor the battery voltage, to warn of a dying system (solar floods have a habit of dying young).

So, for quick and easy ESP-01 nodes, either use a wall wart, or piggyback them onto larger systems that won't notice a 60ma load.

Saturday, August 15, 2015

Powering Wireless IoT Nodes

Many hobbyists entering the IoT arena assume they can write an application for an RF development board then "just add a battery" to magically permit it to run for months or years before a recharge or replacement is needed.  They are shocked when they see the batteries die in hours or days.

Supplying power to an IoT board requires understanding the big picture of power management.  Power management affects all aspects of system development, from design to deployment.

System Design Overview

Let's start by reviewing the basic system design and development process.

1. Overall design:
- What must the system do?
- What are the minimum required features?
- What are the "nice to have" features?
- What is our development budget?
- What is our per-node deployment budget?

2. Identify physical/hardware constraints.
- Does the system have to fit in a specified volume or have a specific shape?
- What hardware features are required? (RTC? UART? Radio? PWM? ADC?)
- Does the system have to use existing hardware, or is a new hardware design permitted?
- What is the power budget?
- Does the system have to use existing power sources?

3. Identify logical/software constraints.
- How much ROM/flash is available for application code storage?
- For use cases that require remote firmware update: How many copies of the application must be retained?
- How much RAM is available for application use?

4. Create a system design.
- Identify dependent requirements (requirements that depend on other requirements or specifications).
- Determine which requirements/specifications impose the greatest constraints (are the most difficult to meet).

5. Analyze the design.
- Identify and address any additional dependent (or inferred) requirements.
- Does the design meet all requirements?

6. Acquire hardware and write software.

That's only the most basic list.  Let's go through the process of designing a simple system, then calculate an estimate of power consumption, then evaluate one power option in detail.

Our Example System

Goal: Monitor a remote fence gate.

Target Hardware: ESP-01 ($5 ESP8266EX-based WiFi module)

Industrial/commercial projects typically will first specify the application in detail, then identify the most appropriate hardware to meet the needs of the application and the budget.  Hobbyists tend to already have some hardware, and want to do something useful with it.

Minimum Required Features:
1. Send a message when the gate opens.
2. Send an "I'm Still Here!" message at least once an hour.
3. The node is remote: No wired power, no wired communications.
4. If disposable batteries are used, they must last at least 3 months.

"Nice to Have" Features:
1. Send a message when a problem is detected (e.g., the batteries get low).
2. Send a message when the gate closes.
3. Send a message if the gate fails to close within 10 minutes after being opened.
4. Provide doorbell support (switch and message).
5. Support OTA (Over The Air) firmware updates.

That's enough detail for now.  Notice that nothing much has been said about the power source.  That's because the power supply is a "dependent requirement", whose details depend on other system features.  There are other dependent requirements that will arise as we perform our analysis.

We will also ignore all "Nice to Have" features:  They can be added after we show that all the minimum system requirements can be met.  However, the design of the minimal system should not exclude or prevent adding any of the optional features.

ESP-01 Power Consumption

Let's look at ESP8266EX power use in detail.  Which means looking at the ESP8266EX datasheet, specifically Table 4 in Section 2.3 on Page 13.

First, it is important to realize the numbers in Table 4 should be viewed as "best case" values.  The ESP-01 values are higher for several reasons, including PCB layout choices and the overall circuit design (what loads are present on which pins).

In particular, we won't see a 10 ua current draw when in Deep Sleep.  We may be able to get down close to 100 ua (0.1 ma) absolute minimum power drain.

Also, Deep Sleep is a PITA, since it requires a full reboot to recover.  Which means when the CPU wakes, it has no idea what it was doing just prior to going to sleep.  The work-around is to save the program state prior to entering Deep Sleep, and restore it upon waking.  The RTC (Real-Time Clock) module (which stays awake during Deep Sleep) appears to have some SRAM in it we can use for this purpose.

Before looking at the other low-power modes, it is important to understand why and when they are needed.  If you are operating from a wall wart, the wall wart itself could be wasting a fair amount of power anyway, so restricting the power use of a mains-powered ESP-01 may not buy you much overall power savings unless you have an extremely efficient wall wart.

That leaves non-mains (battery) power as the primary reason to try to minimize ESP-01 power use.  But before even thinking about using any particular sleep mode, it is critically important that the application itself isn't wasting any power whatsoever.  Let's look at Table 4 again.  The biggest use of power is transmitting: While the specification says transmit takes up to 170 ma, a quick online search will show that the ESP-01 actually uses 300ma.  So it would make great sense to bend over backwards to minimize the transmissions that are needed.
  
Transmitting more during each awake period will impact battery life, which means compromises will be needed in other parts of the system to regain satisfactory battery life, primarily by a) reducing the number of awake periods (increase the time between "I'm Here!" packets), and b) using the deepest available sleep mode.

WiFi Transmission

Minimizing WiFi transmissions is tough:  When a node boots, it must do a few handshakes with the AP (WiFi Access Point) before it is allowed to send data to the local network.  These handshakes involve basic access (SSID and Password), configuring encryption, using DHCP to get an IP address lease, and there may be more I'm not remembering at the moment.

So a good transmission conservation goal is to prevent having to repeat any part of the boot-up handshake when an ESP-01 wakes up to send data.  Fortunately, there are ways to do this!  But it isn't well documented, and I haven't yet done it myself with any of my ESP boards.

Next comes sending the data itself.  Pretty much the entire Internet uses TCP/IP, which is a Good Thing, since TCP provides "guaranteed delivery" (every packet sent is eventually received).  But that guarantee comes with connection timeouts and overhead, which our battery-powered ESP-01 nodes should avoid.  Which means using UDP.  But that has its own problems: UDP is a "connectionless" protocol that has no packet delivery guarantees, so we must add our own minimal ACK protocol to let the node know its data has been received and it can go back to sleep.

But for Home Automation and Security, we'd want our protocol to do a bit more than just "ACK".  Since WiFi encryption isn't all that secure, we should consider adding our own encryption layer (Highly Recommended!).  And the protocol should also protect against MITM (Man In The Middle) attacks as well as Packet Injection attacks.  Fortunately, once the link is encrypted, adding a pseudo-random packet serial number can provide these safeguards.  (I've just described some of the key features in the Thread and CoAP protocols, which both use DTLS.)

The last thing the protocol must do is support "NAK", which means the recipient got a packet, but it may have been corrupted in some way (possibly due to an attack), so a NAK tells the sender to do a retry. After sending a packet, the sending node must wait some pre-determined time for the ACK or NAK to arrive.  This wait is done in Receive mode, which uses much less power than Transmit mode (~60 ma, vs ~200 ma).
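Since we have to roll this protocol ourselves anyway, here's a minimal Python sketch of the receiving side (the always-on host our gate node reports to).  To be clear, the port number, shared key, and packet layout are all invented for illustration, and a simple monotonically-increasing serial number stands in for the pseudo-random one mentioned above; real node firmware would have to use the same conventions:

import hmac, hashlib, socket, struct

KEY = b'replace-with-a-real-shared-secret'   # assumption: pre-shared key
PORT = 47808                                 # assumption: arbitrary UDP port
MAC_LEN = 8                                  # truncated HMAC tag length

last_serial = {}   # highest serial seen per node, to reject replayed/injected packets

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('', PORT))

while True:
    packet, addr = sock.recvfrom(256)
    if len(packet) < 7 + MAC_LEN:
        continue                                  # runt packet: ignore
    body, tag = packet[:-MAC_LEN], packet[-MAC_LEN:]
    expected = hmac.new(KEY, body, hashlib.sha256).digest()[:MAC_LEN]
    if not hmac.compare_digest(tag, expected):
        sock.sendto(b'NAK', addr)                 # corrupted or forged: request a retry
        continue
    node_id, serial, event = struct.unpack('!HIB', body[:7])
    if serial <= last_serial.get(node_id, -1):
        continue                                  # replayed packet: drop silently
    last_serial[node_id] = serial
    sock.sendto(b'ACK', addr)                     # the node may now go back to sleep
    print('node %d sent event %d (serial %d)' % (node_id, event, serial))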

Whew!  Clearly, the CPU is going to be busy.  Fortunately, when the radio is off ("Modem Sleep") the CPU is still running, and it uses "only" 15ma.  So when there is lots of computation to do, it is best to do it with the radio off.  Or, conversely, turn the radio on only when needed, but be sure to wait the 3ms radio wakeup/warmup time before sending data.

The Gate Monitor

The only external input device connected to the ESP-01 is a switch that opens when the gate is opened.  When the gate opens, the ESP-01 wakes up to send a single packet, then waits up to 1 second for a reply. If the gate isn't opened for 1 hour, the ESP-01 will send an "I'm Still Here!" packet and wait for a reply.  In either case, when the ACK is received, the ESP-01 immediately goes back to "sleep" (the kind of sleep isn't yet defined).
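As a sanity check, the node's half of that exchange can be mocked up in Python on a PC, which lets us exercise the listener sketched earlier before writing a single line of firmware (same caveats: the host address and event codes are invented for illustration):

import hmac, hashlib, socket, struct, time

KEY = b'replace-with-a-real-shared-secret'      # must match the listener
HOST, PORT = '192.168.1.10', 47808              # assumed address of the always-on host
EVENT_HEARTBEAT, EVENT_GATE_OPEN = 0, 1         # invented event codes

def send_event(node_id, serial, event, timeout=1.0):
    body = struct.pack('!HIB', node_id, serial, event)
    tag = hmac.new(KEY, body, hashlib.sha256).digest()[:8]
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)                    # wait up to 1 second, per the spec above
    sock.sendto(body + tag, (HOST, PORT))
    try:
        reply, _ = sock.recvfrom(16)
        return reply == b'ACK'                  # True -> safe to go back to sleep
    except socket.timeout:
        return False                            # no reply: caller decides whether to retry

print(send_event(node_id=1, serial=int(time.time()), event=EVENT_GATE_OPEN))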

To keep things as simple as possible, let's use 3 AA alkaline batteries in series, connected to a 3.3v LDO regulator.  Our batteries will be Eveready Energizer AA, which have this datasheet and a nominal 2500 mah rating.

Now let's define a "typical day" for our gate: If nothing at all happens, the node will wake up 24 times to send "I'm Still Here!" packets.  Let's say the gate is opened an average of 4 times each day.  We can view each "open" event as an interruption (and restart) of the one-hour "I'm Still Here!" cycle.  Skipping the basic statistics, this means our typical day will see 26 packets sent.  And let's assume our transmissions never fail, and that we always get an ACK 100 ms after the packet is sent.

Estimating Power Use

When the ESP-01 wakes (either due to the gate switch or the timeout), it first will power on the radio, and during the 3ms warmup period it will prepare the packet to be sent.  Then the packet will be transmitted, which may take 3 ms.  Then the ESP-01 listens for the reply, which it sees 100 ms later.  After verifying the ACK, the ESP-01 powers off the radio and goes back to "sleep".

How long does this take?  Assuming the CPU wakes in zero time (which may not be the case), we have 3 ms radio power up, then 3 ms transmit, then 100 ms listening.  We'll also assume it takes zero time to go to "sleep".  That's a total of 106 ms awake.  This happens 26 times a day (on average, for our scenario), which is 26 * 106 ms = 2756 ms.  That's less than 3 seconds awake for the entire day to monitor a gate!  Not bad.

But how much power does the awake time consume?

We need to calculate this from the perspective of the battery, not the ESP8266EX.  It is important to understand that the batteries start out at 1.5v * 3 = 4.5v, and that value will decrease until the LDO cuts out at about 3.4v.  But how do we account for the LDO that sits between the batteries and the ESP8266EX?  I don't want to get into the details of linear regulator operation here, but suffice it to say that the LDO "wastes" the voltage difference between the battery and 3.3v, meaning the current drawn from the batteries is the same as the current delivered to the ESP8266EX.

So, each wakeup has 3 ms @ 70 ma (radio warmup, packet prep), 3 ms @ 300 ma (send packet), and 100 ms @ 70 ma (wait for reply).  Our average current consumption during the 106 ms awake time is (3*70 + 3*300 + 100*70)/106 = 77 ma.  Since we're measuring our battery capacity in mah (milliamp-hours), let's see how much of our battery we're consuming each day while awake: It's simply 77 ma * 2.756 s = 212 mas (milliamp-seconds) per day, which we divide by 3600 to get about 0.06 mah.

Now, let's figure out how much current we'd consume for each of our 3 sleep modes for the rest of the day.  First, how long will the ESP-01 be sleeping each day?  That would be: 24 hours - awake time = 24 hours - 2756 ms = 23 hours, 59 minutes, and 57.244 seconds, or 23.9992 hours.  To keep it simple, we'll just round to an even 24 hours.  (No, that's not cheating!)

Deep Sleep: 100 ua * 24 hrs = 2400 uah = 2.4 mah
Light Sleep:  1 ma * 24 hrs = 24 mah
Modem Sleep: 15 ma * 24 hrs = 360 mah

Notice that even Deep Sleep mode uses far more power over a day than the awake time does.  That's not to say that Deep Sleep is expensive, but rather that we have done a good job minimizing the power used while awake (can't do better than a single packet).

The next step is to figure out our total daily current consumption for each of the three scenarios. We just add the awake mah to the sleep mah for each sleep mode:

Deep Sleep = 0.06 mah + 2.4 mah = 2.46 mah/day
Light Sleep = 0.06 mah + 24 mah = 24.06 mah/day
Modem Sleep = 0.06 mah + 360 mah = 360.06 mah/day

Now, let's figure out our battery life for each of the three sleep scenarios.  The "right" way to do this is to use the average current for each scenario and follow the "constant current" discharge curve in the battery datasheet.  But I'm feeling lazy, so we'll just assume a fixed battery capacity of 2500 mah.  Since the batteries are in series, the current is common to all three batteries, so it's still 2500 mah for the set (but at a starting voltage of 4.5 v instead of 1.5 v).

For each of our scenarios, we'll divide 2500 by the daily mah consumed to get the battery life in days:

Deep Sleep = 2500 mah / 2.46 mah/day = 1016 days (Nearly three years!)
Light Sleep = 2500 mah / 24.06 mah/day = 104 days (About 3.5 months)
Modem Sleep = 2500 mah / 360.06 mah/day = 7 days (A week)
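Since estimates like these are easy to fumble by hand, here's the whole calculation as a short Python script you can re-run with your own numbers.  Every constant comes straight from the assumptions above:

# Daily power budget for the ESP-01 gate monitor (all values from the text above).
WAKEUPS_PER_DAY = 26           # 24 hourly heartbeats plus ~4 gate events
AWAKE_PROFILE = [              # (duration in ms, current in ma) per wakeup
    (3, 70),                   # radio warmup + packet prep
    (3, 300),                  # transmit one packet
    (100, 70),                 # listen for the ACK
]
SLEEP_MA = {'Deep': 0.1, 'Light': 1.0, 'Modem': 15.0}   # sleep-mode current draw, in ma
BATTERY_MAH = 2500             # 3x AA alkaline in series (capacity of the set)

awake_mas = WAKEUPS_PER_DAY * sum(ms * ma for ms, ma in AWAKE_PROFILE) / 1000.0
awake_mah = awake_mas / 3600.0                           # milliamp-seconds -> milliamp-hours
print('Awake: %.3f mah/day' % awake_mah)

for mode, ma in sorted(SLEEP_MA.items()):
    daily = awake_mah + ma * 24                          # sleep draw over (roughly) 24 hours
    print('%5s Sleep: %7.2f mah/day -> %4.0f days on one set of batteries'
          % (mode, daily, BATTERY_MAH / daily))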

And, there, finally, at long last, you have it.  The effect of sleep mode on battery life for the absolute simplest-but-realistic ESP-01 application I can think of.  (It gets worse, perhaps much worse, if your ESP-01 needs to do anything remotely intelligent.)

What have we learned?  Consuming 3 batteries each week adds up, so even if we turn off the radio (but keep the CPU awake), we'll still want to use a wall wart.  Which in turn means that all battery-operated ESP-01 applications should plan on at least using Light Sleep, which should be easy to program (SRAM stays powered). 

Now, imagine that you had 10 ESP-01s running on batteries.  Then even Light Sleep would go through over 100 AA batteries each year!

Using Deep Sleep will take some work to save persistent values before sleeping (since the CPU RAM is powered off), and to restore them upon waking.  But our 10 ESP-01s would then use only about a dozen AA batteries in a year, which is much more affordable.

Bottom line?  Deep Sleep is a Win.  We need the ESP-01 to fully support it, including a safe interface to the RTC SRAM.  Otherwise, the ESP-01 is truly useful ONLY when plugged in, or when connected to a battery system that costs much more than the ESP-01 itself.

Other Ways to Manage Power

The above analysis shows it is fairly difficult to use the ESP-01 with battery power for even the simplest wireless nodes.  Such applications should consider using a lower-power radio, such as BTLE, ZigBee, Z-Wave, ANT or 6LoWPAN.

If cost is not an issue, we can improve the power supply.  Each of the following alternatives comes with its own issues to resolve:
- Use a switching regulator instead of an LDO linear regulator (more complex power analysis).
- Use rechargeable batteries (limited number of charge/discharge cycles, self-discharge and thermal issues).
- Use solar power (needs batteries at night, which also means a battery charger).
- Use an ultracapacitor (nearly infinite cycles, but large and expensive).

Of course, we also could try to eliminate the remote node itself, which would in turn eliminate all battery life issues. For example, we could point a webcam at the gate and use video analysis software to detect gate position and movement.  But doing so would make it impossible to add some of the "Nice to Have" features, such as the remote doorbell.

Next Steps

Given that the above does meet all our mandatory requirements, our next step would be to design the circuits to interface the gate switch to the ESP-01 and add the LDO regulator, after which we'd build an instance of the hardware.

Next, we'd develop the software (using whatever ESP8266 development environment you prefer) and make it work!

Finally, after all the mandatory features work, and we verify the system power use is very close to what we estimated, we'd next consider which optional features to add.

Sunday, July 19, 2015

Software I Always Install

When I get a new Windows 7/8.1 PC, the first thing I do is install some key free applications to "tame" it. To the greatest extent possible, I like to have similar tools available for all platforms, so I can use any available Windows/Linux/Mac/BSD system as a workstation without having to kill my workflow or productivity.

General:
  1. Firefox - Not necessarily the best browser, but it has the best plugins.
  2. NoScript - Makes browsing tolerable.  Blocks a bunch of evil.  Has a learning curve: RTFM.
  3. Classic Shell - Provides a Win7-style Start button/menu/taskbar under Win8.
  4. Clover - Adds tabs to Windows Explorer.
  5. Foxit PDF reader - Better than Adobe.
  6. LibreOffice - Covers 99% of what I'd need from Microsoft.
  7. 7-Zip - The best all-platform archive handler.
Engineering Tools:
  1. Cygwin (32-bit) - A full GNU/Posix environment under Windows.
  2. MobaXterm - Multi-tab terminal and X11 environment. Better than CygwinX, includes Cygwin subset.
  3. Geany - My favorite lightweight IDE for all-platforms (even RasPi).
  4. TortoiseGit - Shell client for Git, which is becoming the global revision control system of choice.
  5. TortoiseSVN - Same idea for Subversion, the prior RCS champ, still very popular.
  6. DesignSpark Mechanical and PCB - Free schematic entry, layout and MCAD.
  7. yEd Graph Editor - Faster and easier than Graphviz.
  8. Anaconda (64-bit) - Ginormous Python environment (all of SciPy and more).
  9. Node.js - Javascript is everywhere.  Deal with it.
Maintenance:
  1. SysInternals Suite - Everything needed to control Windows. 
  2. CCleaner - Even new systems need cleaning.
  3. EaseUS Todo Free - Backup & cloning toolset.
I'll update this post when I find new favorites.

There are several multi-platform numerical and symbolic analysis apps I used to install, but I'm trying to force myself to stay within the Scientific Python environment.  Still a bit of a learning curve, but it's coming along.  There are lots of reasons for this, but that's a separate post.

Fighting Bloatware

My new Asus laptop came with a ton of junk pre-installed, and even Windows 8.1 includes some.

The first thing I do is remove all games.  Not that I don't like playing occasionally, but many pre-installed games come with stuff you do not want on your computer.  The simplest thing to do is to delete them all, then selectively install the individual game apps you like.

Next, install the full SysInternals Suite.  Run AutoRuns, select the Everything tab, and take a look at what your system is firing at boot and/or login.  Uncheck the obviously bloatful programs, then do web searches on the ones that look suspicious.  Now, this won't remove the offending program from disk, but it will keep it from running at startup or login.

Sometimes a useful package installs bloatware: Uninstalling the entire package may not be practical, but the above procedure keeps its junkware from running, which means only a bit of disk space is being wasted.

Saturday, July 18, 2015

Why I chose an i3 laptop.

What?  I purposely got an i3 laptop?  Not the expected i7, or even an i5?

Yes.  Yes I did.  Here's why:

Most of what I do on a PC (browsing, IDEs) rarely taxes the CPU.  When I went shopping for a new laptop, I wanted to emphasize creature comforts (decent display, usable keyboard & trackpad) over performance (storage, memory and CPU).

When Fry's had a sale on the Asus Transformer Book Flip (TP500L) I snapped it up, despite it only having an i3-4030U with 4GB DRAM and a slow 500GB disk.

Having a touchscreen combined with the keyboard able to flip to make a stand or all the way backwards to make a huge tablet was a plus, since I will often use the system as the interface for various USB instruments (o'scope, logic analyzer, signal generator, device programmers, etc.), and being able to get the keyboard out of the way saves lab bench space.

Even more important is that I can stand it up sideways (portrait orientation) to view documentation at the workbench, scrolling with the touch of a finger, using the on-screen keyboard for the occasional text search. Think about all the paper that will save, especially since my PDF viewer (Foxit) supports annotation.

Finally, a modern i3 provides maximum battery life and minimum overheating concerns.  But, not surprisingly, the system was dead-dog slow out of the box.  No surprise there, but I had plans.

Today Fry's had a 240GB SSD on sale, which I installed with about 20 minutes of mechanical work and an hour of copying time using EaseUS Todo Backup Free, which also handles partition resizing.

My total investment so far is still less than even a low-end i5 system without the flip screen capability.  And the performance with the SSD makes it feel like it's punching way above its i3 weight-class.

Now it cold-boots in about 4 seconds, and programs load much faster.  Browsing is also quicker due to the rapid disk cache updates.  Software installs are also swifter.

But best of all is the quick swap speed, which will postpone a DRAM upgrade until I need to install truly massive tools, such as the Xilinx FPGA development toolchain.

The moral of the story?  If you have an unlimited budget, get an unlimited laptop.  Otherwise, subtract the cost of an SSD from your budget, then get the best laptop you can with what's left.

What's this blog about?

This will be a collection of miscellaneous items that I've encountered at work, home or in my hobbies.

Some will be tricks related to the main computing platforms I use (Win7/8.1, Cygwin, Ubuntu, RedHat) and the programming languages I use (primarily Python and C/C++).   Others will concern my hobby work in IoT and HA (Home Automation).

I also expect to cover lots and lots of stuff about my life and work as an embedded/real-time software engineer working on systems ranging from 8-bit processors programmed to the bare-metal to multi-core 32-bit systems running various forms of Embedded Linux, lightly spiced with some occasional DSP work.

But I suspect most posts will concern things I've encountered online that I want to share and not forget about.