Open Source and Hardware Engine Management Systems (ECUs)

I recently fell down a rabbit hole of car modding videos on YouTube. Most of the videos I watch involve engine modifications, and there inevitably comes a time when the car gets a new tune on a dyno in order to realize the potential afforded by the mods.

Tuning a modern car involves tweaking parameters in the engine management system, or engine control unit (ECU). Budget builds often make use of the OEM ECU that originally came with the engine (which may or may not be the engine originally installed in the chassis), but the addition of forced induction via a turbo or supercharger to a normally aspirated engine usually calls for an aftermarket ECU.

Aftermarket ECUs offer another advantage for older vehicles; by using existing engine sensors, and perhaps adding a few more, a lot of engine control functions can be consolidated in the ECU, rendering various vacuum controls and mechanical linkages unnecessary. This can help clear space in a crowded engine bay, and also improve reliability, serviceability and, potentially, performance, fuel economy and emissions.

From my research, tuning an OEM ECU may involve hundreds of dollars in software and hardware. A modern aftermarket unit suitable for upgrading a four cylinder engine starts at about $850, and one can spend more than twice that for an advanced feature set and the ability to run sequential ignition and fuel injection on a six or eight cylinder engine.

The prices of aftermarket ECUs aren’t outrageous compared to the cost of parts or labor for a big project, but they start looking more substantial if you plan to do most of the work yourself while scavenging parts as cheaply as possible. This got me wondering about whether there was a community of people either developing open source firmware for common OEM ECUs, or perhaps custom hardware.

It didn’t take too much looking to find two active projects with healthy communities around them, Speeduino and rusEFI.

Speeduino has been around since X. In 2017 it was a finalist for the Hackaday Prize. The author, Josh Stewart, started out by using Arduino-based hardware to run a lawnmower engine. By now it’s been used to run a variety of engines with 4-8 cylinders. The hardware is based on an Arduino Mega. It adds a robust automotive power supply, protection for the I/O channels, and driver circuitry suitable for ignition coils, fuel injectors and other components.

A board capable of sequential fuel injection and ignition on 4 cylinder engines is available assembled for under $200. A unit that is plug-compatible with the ECU on a first generation Miata is available for ~$250, including an external housing.

Interestingly (to me), the Speeduino firmware takes advantage of the Arduino build environment and some libraries. This has enabled people to port the firmware to more capable ARM-based Arduino-like devices, like the Teensy. These ARM based platforms afford the possibility of more advanced peripherals, like CANbus controllers, more memory for data logging, and more headroom, allowing things like unlimited software timers to replace a limited number of hardware timers on the ATMEGA.

I think rusEFI has been around since at least 2013. There are already some great hardware options: $270 gets you a unit with a robust waterproof case, and someone is gearing up to sell a board capable of running sequential ignition and fuel injection on a V-12 engine.

Both systems (currently?) rely on 3rd-party commercial software for the tuning process, but the software, Tuner Studio, is available for less than $100.

I could do a much longer post, but this post is long enough, so, this is where it ends.

CN3791 MPPT Solar Li-Ion Charger Module Hinky Circuit

Last year, I paid about $3.66, with shipping, for this solar-powered MPPT lithium ion battery charging module on eBay to use with my small solar panels and scavenged 18650 batteries. It has some issues.

First off, the version I purchased/received is intended for 9V solar panels, and I wanted to use it with a ~6V panel. This is set with a resistor divider. Careful study of photos from product listings showed that the divider was implemented using the same resistor value for the high segment of the divider, changing only the value of the lower segment’s resistor to change the setpoint.

The high segment had a value of 178KOhm and the low ranged from ~42KOhm for a 6V panel down to 12.6KOhm for an 18V panel. I didn’t have any SMD resistors of suitable value in my supplies, and I couldn’t find any to scavenge from surplus PCBs, so I decided to use a trimpot instead. I had a variety on hand, and a trimpot would let me experiment to find the optimal clamping voltage for the panel I had, and for an 18V panel I’d ordered. I chose a 200KOhm trim pot with the idea that approximating the total resistance of the existing divider would help preserve the stability of the control loop. If I were doing it again, I’d probably choose a different configuration to minimize the impact of the pot’s temperature sensitivity. A simple choice would be a ~20KOhm trimpot, configured as a variable resistor (short the wiper to one terminal), used to replace the low segment while leaving the 178KOhm resistor in place.
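If, as I believe, the CN3791 regulates the panel at the point where the divider tap reaches an internal reference of roughly 1.205V (an assumption on my part, not something I’ve verified against the datasheet), the resistor values observed in the photos line up nicely with the advertised panel voltages:

```python
# Hypothetical check of the MPPT setpoint divider, assuming the CN3791
# holds the panel at the voltage where the tap reaches ~1.205V.
V_REF = 1.205  # assumed internal reference voltage (volts)

def panel_setpoint(r_high, r_low):
    """Panel voltage at which the divider tap reaches V_REF."""
    return V_REF * (r_high + r_low) / r_low

# Resistor values observed in the product photos (ohms)
print(round(panel_setpoint(178e3, 42e3), 2))    # "6V" version -> 6.31
print(round(panel_setpoint(178e3, 12.6e3), 2))  # "18V" version -> 18.23
```

With the 200KOhm pot standing in for the whole divider, hitting any setpoint in that span is just a matter of where the wiper lands.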

After adding the potentiometer, I connected the battery and panel and adjusted the potentiometer until I maximized the charging current. I was a little surprised by how low the panel voltage was, so I started poking around. The first thing I checked was the voltage drop across a P-channel MOSFET on the panel input. I was surprised to find that it was 500mV, though knowing that, I wasn’t surprised the IC was noticeably warm. A tenth of the panel voltage was being dropped across the MOSFET!

Some of the photos on some of the product listings showed a simpler circuit, without anything in the panel input current path. My guess is that the MOSFET and accompanying resistor and diode were added in a revision to protect the circuit in case the panel polarity was accidentally reversed, and/or to block leakage of charge from the battery through the panel at night. A Schottky diode would accomplish the same thing more simply, but with a voltage drop of ~300mV. Properly implemented, a MOSFET-based “ideal diode” would have an effective resistance of ≤ 50mOhm, and a voltage drop of ≤ 50mV at the ~1A max current my panel could deliver.
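Rough numbers at the ~1A my panel can deliver put the difference in perspective (these are the voltage-drop figures discussed above, not new measurements):

```python
# Rough comparison of input-stage power loss at ~1A panel current.
I_PANEL = 1.0  # amps

loss_mosfet_as_found = 0.5 * I_PANEL   # the 500mV drop I measured
loss_schottky = 0.3 * I_PANEL          # ~300mV Schottky forward drop
loss_ideal_diode = I_PANEL**2 * 0.050  # <=50 mOhm ideal-diode controller

print(loss_mosfet_as_found, loss_schottky, loss_ideal_diode)
# 0.5 W vs 0.3 W vs 0.05 W
```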

I’m not completely sure how the circuit was intended to work, but clearly, it wasn’t doing the job. I wondered if it would work properly with a 9V panel, as intended, but that didn’t seem possible, either. The panel + was connected to the MOSFET’s source, the rest of the circuit to the drain, and the gate was connected to the drain via a resistor and diode. By my reasoning:

  • the gate would sit at approximately the potential of the drain
  • the voltage drop from source to drain should be as close to 0V as possible in order to maintain the efficiency of the circuit
  • therefore, Vgs would/should approximate 0V
  • but it can’t, because the Vgs threshold for the MOSFET is ~2V!

I wasn’t sure how to fix the circuit, but I was sure that the gate needed to be pulled down to a lower voltage, so I cut the trace connecting the resistor to the drain and connected it to ground instead. It worked well enough that the voltage drop over the input MOSFET went from 0.5V to a trivial number. I’m pretty sure, though, that I didn’t fix the protection function.

I’ve since received another version of the module with a revised input circuit. The diode and parallel resistor connecting the gate and drain are still used, but there is now another resistor connecting the gate to the charging-indication pin on the CN3791. This pin is open drain. When the battery is charging, it is pulled low, lighting the charge indicator LED AND pulling the input MOSFET gate low. Vgs ≅ -Vpanel ≅ -6V, turning the MOSFET fully on.

Thinking through this further… if the battery is charged and the panel is illuminated, the gate will approximate the potential of the input MOSFET drain and, since the only load on the panel is the quiescent current of the module, Vsd ≅ 0V ≅ Vgs, so the MOSFET will be off, save any current through the body diode.

If the panel is dark and the battery is charged then Vd of the input MOSFET will, at most, be at battery voltage (Vbatt), Vs will be ~0v, Vg will ≅ Vd, Vgs ≅ Vd and the input MOSFET will be off.

If the panel is reversed Vs will be below GND and well below Vg ≅ Vd ≅ Vbatt so Vgs will be Vbatt + Vpanel, and the MOSFET will be off. Note: This means that reverse polarity with an ~18V nominal panel would exceed the Vgs maximum of 20V for the TPC8107 MOSFET used at the input.
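Pulling the cases above together, here’s a toy model of how I read the revised input stage (the threshold and node voltages are illustrative, not measured):

```python
# Toy model of the revised input stage: a P-channel MOSFET conducts when
# Vgs is below its (negative) threshold. During charging, the CN3791's
# open-drain charge pin pulls the gate near ground.
V_TH = -2.0  # illustrative P-channel threshold (volts)

def mosfet_on(v_source, v_gate):
    """True if the P-channel MOSFET's channel is enhanced (conducting)."""
    return (v_gate - v_source) < V_TH

# Charging: gate pulled near ground, source at ~6V panel voltage -> on
assert mosfet_on(v_source=6.0, v_gate=0.0)
# Panel dark, battery charged: gate floats toward drain/battery -> off
assert not mosfet_on(v_source=0.0, v_gate=4.0)
# Panel reversed: source below ground, gate near battery voltage -> off
assert not mosfet_on(v_source=-6.0, v_gate=4.0)
```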

If I get around to it I’ll draw a schematic and add it to this post.

Balight 21W Folding Solar Panel USB Charger Partial Teardown

I picked up a 21W, 3-panel Balight folding solar panel USB charger from Amazon for ~$36 a couple of weeks back. It uses high-efficiency SunPower Maxeon cells, much like similar 20-21W panels from Aukey, Anker, and dozens of obscure brands. All of them have the same basic construction: they are made from nylon ballistic cloth, and each fold has a panel made from two SunPower cells encapsulated in a flexible waterproof sheet. The panels provide power via two 5V USB ports, which presumably have some sort of voltage regulator.

I wanted to know more about how the chargers worked. In particular, I wanted to know if they were wired in series, or parallel because I wondered if it was worth trying to tap into the raw output, before the USB regulator to reduce power conversion and resistive losses for some applications.

I thought I’d be able to get the information I needed by finding someone documenting a teardown of their own panel on YouTube or in a blog post. Despite the dozens of variants from dozens of brands and a handful of manufacturers, though, I didn’t find what I was looking for.

So, I decided to dig up a seam ripper and open my panel far enough to get a look at the wiring, and tap in to it upstream of the voltage regulator.

The panels appear to be wired together with some sort of woven wire conductor. I had some hope that all the cells would be wired in series, to give a nominal panel voltage of 18V. Based on what I could see, and measuring the voltage before the regulator in full sun, it looks like each panel’s cells are wired in series, for a 6V nominal voltage, and then the panels are wired together in parallel. I was disappointed at first, but this arrangement makes sense upon further thought.

Using a 2s3p configuration means that the input voltage into the switching regulator should be pretty close to the 5V (actually, 5.2V with enough sun and a light enough load) output of the USB power regulator, which will typically give higher conversion efficiency than stepping down from 12 or 18 volts. It also means that the manufacturers can stock one converter for everything from a 7W single-panel charger up to a 28W 4-panel charger, without the converter having to support a wide range of input voltages. Perhaps most importantly, it means that partial shading of one panel shouldn’t have a disproportionate impact on the power output of the entire array.

The only downside is that resistive losses in the cabling will be higher with lower voltage and higher current, but since the interconnects aren’t more than a foot or so, the resistive losses shouldn’t be too high.
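A back-of-the-envelope check bears that out. Assuming ~50mOhm of interconnect resistance (a guess for a foot or so of wiring) and 20W delivered, the penalty for 6V over a hypothetical 18V series wiring is real but small:

```python
# I^2*R cable loss for the same power delivered at 6V (2s3p, as wired)
# versus 18V (all-series). The 50 mOhm wiring resistance is assumed.
R_WIRE = 0.050  # ohms, assumed
POWER = 20.0    # watts delivered

def cable_loss(volts):
    current = POWER / volts
    return current ** 2 * R_WIRE

print(round(cable_loss(6.0), 3), round(cable_loss(18.0), 3))
# ~0.556 W at 6V versus ~0.062 W at 18V
```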

As for the converter itself, I may look at it more closely and add some more details, but, a few initial observations:

  • The PCB design has extensive ground planes on top and bottom, tied together with vias.
  • Both outputs are served from a single buck (step-down) converter based on a Techcode TD1583, a 380KHz fixed-frequency monolithic switch-mode regulator with a built-in power MOSFET.
  • It looks like only port 1, at the top right in my photo, has the data lines connected, which suggests that it is the only one with fast-charge coding.
  • IC U2 looks like it has its markings sanded off. I notice, though, that one of its pins is connected to the enable pin on the TD1583, leading me to think that it is responsible for cycling the output so that devices resume drawing as much power as possible when the panel voltage recovers after clouds or shade pass over the array. I don’t know whether it is an MCU, some sort of timer, a comparator, or what.

There you go. I can’t be sure that other folding solar arrays like this one are wired in the same way, but if they only support a 5v output, I suspect they will be. I hope this proves useful to someone besides me.

New to Me: EDC 521 DC Voltage/Current Source

Last week I came across a miscategorized eBay listing for an Electronic Development Corp (EDC, now owned by Krohn-Hite) 521 DC Voltage/Current Source. It was listed in the network equipment section, with “Juniper” as the manufacturer.

The EDC 521 is a precision DC reference source with high accuracy, precision and stability, intended for the calibration of meters and sensors. It can output voltage in three ranges (0-100mV, 0-10V, and 0-100V) and constant current in two ranges, 10mA and 100mA (with compliance voltages up to 100V). In each range, the precision/resolution of adjustment is 1ppm. Overall stability in voltage mode, within the device’s operating temperature range, is 7.5ppm over 8 hours, 10ppm over 24 hours, 15ppm over 90 days, and 20ppm over a year. The temperature coefficient is included in the above estimates. It is microprocessor controlled and has a GPIB interface to allow remote control.

To achieve its basic stability, it uses an aged and selected 1N829 temperature compensated Zener diode as its primary voltage reference. This diode is driven by a stable precision current source at a current chosen to provide the best combination of temperature stability, long-term drift and low-noise for the individual diode used in each unit. Adjustments are made using a custom, precision 24-bit digital to analog converter.

Voltage divider resistors and 1N829a temperature compensated zener voltage reference.

The DAC works by feeding the reference voltage across a resistor divider to obtain 10 output voltages, tapped at 500mV intervals. If I understand correctly, these voltages are switched to provide analog voltages for each decade; the decade voltages are buffered, then weighted and summed using precision resistors before being fed to the output amplifier.
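My understanding is loose enough that this is only a sketch, but the decade-weighted summing I’m describing would look something like this (the tap spacing and factor-of-ten weighting are my assumptions, not values taken from the schematic):

```python
# Sketch of decade-weighted summing: each decade selects a tap from the
# 500 mV ladder, and each successive decade contributes ten times less.
TAPS = [i * 0.5 for i in range(11)]  # 0.0V, 0.5V, ... 5.0V

def summed_output(digits):
    """Weighted sum of the selected taps, one tap per decade."""
    return sum(TAPS[d] / 10 ** place for place, d in enumerate(digits))

print(summed_output([7, 5, 0, 0]))  # 3.5 + 0.25 -> 3.75
```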

When the package arrived yesterday, I saw why the listing had been miscategorized — it was packed in a box for a Juniper Networks switch. That, and the sticker noting a failed calibration attempt in 2009 makes me doubt the seller’s assertion that it was “pulled from a working environment.” Not that I expected a pristine, calibrated instrument for $150.


Inside the box, I found things in a bit worse physical shape than I expected. What I thought was shadow/glare in the photo from the eBay listing was actually a torn red filter over the LED display. And the underside of the case, which wasn’t pictured in the listing, had a huge dent.


On closer inspection, the dent didn’t reach the PCB inside, and I was able to remove the panel and hammer it out. Once inside, I found that everything had a fine coating of persistent dust. Hitting it with canned air shook some of it loose, but most of it remained.

So, I got to work rinsing it with a lot of isopropyl alcohol which I then chased off the edge of the board with canned air. After a few repetitions, the top and bottom side of the board were pretty clean. I then looked over both sides of the board closely, looking for damaged components, and cleaning out little pockets of residue.

I didn’t see any damaged components, but along the way I noticed signs that the board had received some major revisions. There was an obvious bodge wire on the bottom of the PCB, but it was also clear that new holes had been drilled to receive additional components. On the top side, I found a cut trace, along with a couple of added resistors and a couple of capacitors. I haven’t traced everything out, but it’s obvious that the bodge wire connects to one end of the internal reference divider, and the rest of it is on the opposite end, so it seems likely that it’s helping isolate the reference divider, and the voltages it produces, from noise sources.

It also appears that a number of power transistors have been replaced. Unfortunately, none of the components in question have obvious date codes, so it’s hard to guess when the modifications were done, and whether the transistors and the filters were added at the same time. Perhaps one of you knows how to decode the markings? The first line is a Motorola logo followed by “616,” and the next line is “JE350,” which is the model/part number. The date codes on other components pretty much all date to late 1996, and the MPU board has a label with the firmware revision, dated January 1997.

Before closing it up, I took care of the loose plastic supports for the back-edge of the PCB, which holds heavy electrolytic filter caps for the power supply. I cleaned the old, crusty, failed double-sided foam tape off and replaced it with new tape so I could stick the supports to the back of the chassis again.

I powered it up and gave it a quick check on all the voltage and current ranges. It seems pretty close to its 1-year tolerances. I was surprised by the amount of time it took to warm up and stabilize, but when I checked the manual, I saw that the warm-up time is specified at 2 hours.


I powered it down overnight. This morning I set up my computer to log voltage readings every few seconds and then powered it back up. I’ll post a graph once I have a day’s worth of data. After that, I’m going to write a script to run through all the possible settings and log the measurements. So, more to come!
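I haven’t written that script yet, but the sweep I have in mind would look something like this. The voltage ranges come from the spec above; the step count is arbitrary, and I’ve left out the actual GPIB commands, since that syntax has to come from the manual:

```python
# Enumerate the settings to step through when logging the EDC 521.
# Sending each setting over GPIB is omitted; commands come from the manual.
RANGES = {"100mV": 0.1, "10V": 10.0, "100V": 100.0}

def sweep_points(full_scale, steps=10):
    """Evenly spaced settings from zero to full scale, inclusive."""
    return [full_scale * k / steps for k in range(steps + 1)]

plan = [(name, v) for name, fs in RANGES.items() for v in sweep_points(fs)]
print(len(plan))  # 33 settings across the three voltage ranges
```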

A Trip to the Museum of Communications

This morning, I woke up with an itch, an itch to see a switch.


No, not that kind of switch, something bigger!

Battery Reserve Switch

Nah, that’s not a switch…

Panel Switch

That’s a switch! Well, part of one.

Museum of Communications

It is part of a panel telephone switch, one of a number of operational telephone switches at The Herbert H. Warrick Jr. Museum of Communications, a little-known technological treasure trove in the Georgetown neighborhood of Seattle. The museum fills the top two floors of a CenturyLink central office building.

The museum’s collection includes all manner of equipment and memorabilia from over a century of telephone history. The presentation of the collection is a bit uneven. There are carefully dated and labeled exhibits of phones and other equipment, along with display cases stuffed with lineman’s tools, but to me, that’s all secondary.

The best part of the museum is that it houses multiple generations of telephone exchange switching equipment, the sort of stuff that used to fill small buildings and connect thousands of homes to the telephone network. Much of it is operational and interconnected, and attended by a staff of volunteers, many of them technicians and engineers retired after long careers with Ma Bell and her successors. They answer questions, give tours, and maintain the equipment.

The automatic switching equipment spans almost a century. The oldest automatic switch is a panel switch that served the Rainier Valley. It was installed in the 1920s and served for over 50 years. Unlike the later automatic switches in the museum, which were made in a factory and then installed, the panel switch was assembled on site at the central office, and moving it after five decades required removing walls. They also have a #1 Crossbar Switch (#1XB) from the 1930s, and a #5 Crossbar (#5XB) from the 1950s. The panel and crossbar switches are all operational. Calls can be placed between lines serviced by the switches, and you can hear the progress of the call set-up and tear-down sound across the racks as the various electromechanical parts do their thing. They also have a number of operational PBX systems, some old-time switchboards, some small Strowger step-by-step switches, and a not-yet-operational #3ESS, a small variant of the first switches to use transistorized logic for control.

The collection also includes inside plant, like power distribution equipment, outside plant, like cables, along with a variety of trunking and long-distance equipment, including equipment for carrying national network television broadcasts.

There is also test equipment spanning decades, and a nice cache of ham radio equipment.

The museum was originally called the Vintage Telephone Equipment Museum when it was created in 1986 by Herbert H. Warrick Jr. Warrick was an engineering director at Pacific Northwest Bell, and started the museum with the company’s support to preserve generations of vintage telephone equipment that was being phased out in the transition to digital switching and transmission. More recently, it became affiliated with the Telecommunications History Group.

I’ve made many visits to the museum over the years, and each time, I learn something new. I recommend it to anyone with an affinity for technology, particularly communications and computing, but it should also interest anyone curious about industrial and economic history. I hope I’ve whetted your appetite.


HP 6177C DC Current Source Troubleshooting/Repair

I picked up a Hewlett Packard 6177C DC Current Source on eBay for less than $75 shipped. This is a precision constant-current source that can deliver 0-500mA at up to 50V.

The seller described the unit as used, with responsive controls and indicators. When I received it, I could see that, while in generally good physical shape, the upper right portion of the front panel was more bent/buckled than I could make out in the eBay photos.

So, first thing I did was partially disassemble the unit to fix the front panel.

Once I got it back together, I did some quick functional tests and found that the current output was consistently 1/10th the expected value. In the 500mA range with the current pot set to maximum, it produces a max of 53mA; on the 50mA range, it produces 5.3mA; and on the 5mA range, 0.53mA. This behavior doesn’t vary noticeably between shorting the outputs and driving a 30 Ohm load. With a suitably high resistance, the voltage will hit >50V, provided the current doesn’t exceed ~50mA.
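The pattern is easier to see with the numbers side by side; the ratio of measured to nominal full scale is the same on every range, which points at a single common cause rather than three separate faults:

```python
# Measured maximum output versus nominal full scale on each range.
nominal = {"500mA": 0.500, "50mA": 0.050, "5mA": 0.005}
measured = {"500mA": 0.053, "50mA": 0.0053, "5mA": 0.00053}

for rng in nominal:
    print(rng, round(measured[rng] / nominal[rng], 3))  # 0.106 each time
```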

So, next step was to look at the service manual and work through the troubleshooting steps.

First thing is to check some voltage rails.  These all checked out, though a few were out of spec on ripple.

Next is to go through the problem isolation procedure, which starts with checking the guard voltage to see if it varies between 0 and -1V. Nope! In each range it maxes out at… ~100mV, or 1/10th of the expected value. Notice a pattern forming?

I started to work through the guard supply troubleshooting instructions, but I got hung up. After disabling the main supply, as instructed and checking a few voltages, it wasn’t clear to me whether I should go immediately through the subsequent steps, or reverse the change and proceed from there. Subsequent instructions just raised more questions.

I asked for guidance in the EEVBlog forum, and while waiting for a response, worked to better acquaint myself with the schematic and theory of operation of the device.

I’m still not sure what to do, and rather than pushing forward while other incomplete projects need my attention, I’ve gathered everything up into a bin and put this one on the shelf for now.


Fish8840 AVR Transistor Tester Review

Today, I’m looking at a neat gadget I got on eBay for about $20 called the “Big 12864 LCD Transistor Tester Capacitance ESR Meter Diode Triode MOS NPN LCR.”

There are hundreds of listings for dozens of variations of these under different names, for prices ranging from ~$12-40. Most, if not all of them, are made in China. Most, if not all of them, are descended from the AVR Transistor Tester project by Markus Frejek, with further improvements by Karl-Heinz Kübbeler. Unfortunately, none of the Chinese clones honor the project’s license and release source code for their firmware modifications. Fortunately, people are figuring out the hardware differences on some of them, and adding support for them to the open source project. The English-language documentation for the project is great. It actually includes information on some of the Chinese clones. Even better, the design and documentation are a great example for learning how to make good use of the hardware on an AVR MCU.

The Fish8840 version I have, which has a PCB date of 2014-07, has a stupid bug in the power-management circuitry which causes it to have excessive current drain when it is supposed to be “off.” This video review by George Thomas includes a simple modification that fixes the problem.

I didn’t really love this one. In addition to the flaw described above, some of the graphics are hard to read. Plus, there are rumors that the hardware is locked to block installation of different firmware.


Still Shopping for my first oscilloscope, but it’s selection time!

I recently decided to buy my first oscilloscope to help with the process of learning more about electronics. The process of selecting an oscilloscope has been longer than expected. I started by looking at the scopes that Sparkfun and Adafruit offered, which led me to wonder what other options were out there. I was a bit overwhelmed by the variety of models and manufacturers, but managed to cut through a lot of the noise.

In this post, I’ll talk more about how I narrowed things down further, and what I ended up choosing.

To start, it’s worth covering why I wanted an oscilloscope in the first place. It all stems from a growing interest in the declining costs of capable “systems on a chip,” low power communication technologies like Bluetooth LE, and even WiFi, and the accessibility of the Chinese electronics manufacturing supply chain. Taken together, it seemed to me that I should learn enough about electronics to have a sense of the possibilities and limitations, and to be better able to collaborate with people with deeper expertise. This approach served me well in the past when I worked on software projects; I wasn’t a programmer, but I understood enough to be a good collaborator.

In looking around for learning projects, I decided to explore the world of power sources for these devices with a new site called Power Cartel. Part of what I’m doing on Power Cartel is teardowns of battery packs and chargers, with the goal of collaboratively creating useful open source designs. In order to dive deeper in my teardowns, and support my own design efforts, I need to be able to start exploring what is happening inside the circuits and understanding how the different components interact.

There are a lot of test instruments. I already have a basic multimeter, but I felt like I needed a way to look at signals over time. There are two instruments that fit that job description: a data logger or chart recorder is useful for looking at signals over long periods of time (i.e. hours), whereas an oscilloscope is good for looking at signals over much shorter periods, from seconds down to micro- or even nanoseconds. I actually want to look at signals over both time scales. I want to look at current and voltage while charging and discharging batteries over a period of hours or days, but I also want to look at sub-second changes in signals.

I decided to start with an oscilloscope, rather than a data logger, because the sub-second changes are more fundamental. With a better understanding of such things, I could build my own data loggers. Moreover, most modern digital storage oscilloscopes can actually record signal changes over longer periods of time, and they can be connected to a computer for control and data recording.

I originally thought that some sort of oscilloscope module that I could connect to my computer or iPad would be a good place to start because it would save me money and space. I quickly learned there were problems with that approach. Most electrical engineers and technicians are trained on traditional stand-alone oscilloscopes, and training aside, traditional stand-alone oscilloscopes are often easier/faster to work with because they have dials and buttons arranged in a user interface that has seen steady improvement for close to a century. Connected scopes are newer, and more of a niche item, so their software is less refined. More importantly, because they are a niche item, there is less competition and scale to push prices down, so any savings that might come from omitting a screen and controls are offset. As a result, USB scopes are not any cheaper than stand-alone scopes with otherwise similar specifications. The space savings were still attractive, but I decided I was going to purchase a stand-alone scope.

With that decision made, I had to figure out the other key specifications for my work. For any scope, whether an older analog scope or a more modern digital storage scope, bandwidth is a key consideration. Bandwidth determines the range of signal frequencies you can measure with the scope. Adafruit sells scopes with 50MHz and 100MHz bandwidth. Sparkfun’s stand-alone scope offering is 100MHz. From a little reading, the switch-mode power supplies I’m going to be working with typically operate in the range of 200KHz to 2MHz, and the microcontrollers I’m working with operate at 4-16MHz, or perhaps 50-70MHz. Some of the wired communications protocols I’m using may run at 5MHz. It would seem, then, that a 50-100MHz scope would cover almost anything I’m likely to use it for in the near future.

Closely related to bandwidth is the sampling rate, the number of times per second a signal is read. For a pure sine wave, an accurate estimate of frequency and amplitude requires a sampling rate at least two times the frequency of the signal being measured. One of the big reasons for having an oscilloscope, though, is to look at signals that are not pure sine waves. A lot of digital communications use square wave signals, and a lot of signals, whether square, sine, step, pulse, or something else, can get distorted by characteristics of the circuits they travel in. A rule of thumb I’ve come across is that the sample rate should be 4x the frequency of the signal you are observing.

Almost all the scopes in my price range ($400-650) have a 1GSa/s max sample rate. That may sound like overkill for a 100MHz scope, but in this segment of the market, that sample rate is shared across all the channels of the scope. So, if you have a scope with two channels and you are using both of them, that’s 500MSa/s per channel, which is closer to the rule of thumb.

The next thing I considered was the number of channels. Each signal you measure requires a channel. Often you are comparing two signals against each other somehow, and so, not surprisingly, most scopes have at least two channels; indeed, in this entry-level segment, most don’t have more than two channels. I found a few with four channels, though. Most were out of my price range, but one, the Rigol DS1074Z, was available for under $600. A two channel scope would probably do everything I reasonably needed, but the option of having four channels was intriguing. Four channels would allow me to look at voltage and current at the same time for both the input and output of a power supply. Even better, I could use some of the channels as a basic logic analyzer, and look at how analog signals changed in relationship to specific digital signals.

The consideration of the number of channels also turned my attention to scopes with a logic analyzer option, to look at even more digital signals. Oscilloscopes with this feature are generally called Mixed Signal Oscilloscopes; they add 8 or 16 logic channels. The added digital channels come with an added cost. There is a $600 mixed-signal scope from Rigol (DS1052D), but it only has two analog channels, 50MHz bandwidth, and 1Mpts of memory. The Rigol MSO1074Z, which is the mixed signal version of the DS1074Z, is $250 more expensive, and out of my price range. Since people seemed less bothered by USB logic analyzers, and since a cheap one could be had for $20 or so, I decided that I didn’t need a full logic analyzer.

Another consideration is the amount of memory available to hold samples. The Rigol DS1074Z has memory for twelve million sample points (12Mpts). Some of the alternatives I considered have much less: the Siglent SDS1102CML has only 2Mpts of memory, and the Siglent SDS1074CFL only 24Kpts.

By now, you’ve probably figured out the DS1074Z is the scope I’m leaning towards. I took repeated looks at the other options from Rigol and Siglent, but kept coming back to the DS1074Z.



  • Siglent SDS1072CML for $319: +lower price -2 channels -memory
  • Rigol DS1052E for $329: +lower price -50MHz -2ch -smaller screen -older
  • Siglent SDS1102CML & SDS1102CNL ~$360: +lower price +100MHz -2ch -smaller memory -can’t tell how they differ from each other, other than memory
  • Gatten GA1102CAL $400: +lower price +100MHz -2ch -smaller memory

I would have liked to spend less, but I wasn’t willing to go with a last-generation Rigol, or forgo two channels and a bunch of sample memory, just to save $150 or so.


  • Siglent SDS1202CNL+ $546 +200MHz, 2Gigasamples/s -2ch -memory
  • Rigol DS1052D $610 +16 channel logic analyzer, -2ch -memory

The higher bandwidth and sample rate of the Siglent just didn’t seem that compelling, nor did spending another $60 for the 16-channel logic analyzer on a last-generation instrument.

More Expensive

  • Siglent SDS1074CFL $723: -higher price +2GSa/s =4ch -memory
  • Siglent SDS2072 $805: -higher price +2GSa/s +memory -2ch +larger screen
  • Rigol DS1074Z-S $818: +signal generator
  • Rigol DS1104Z $830: +100MHz bandwidth
  • Rigol MSO1074Z $835: +16 channel logic analyzer
  • Rigol DS2072A $839: -higher price -2ch +2GSa/s

The price was really enough to knock all of these out. The Siglent SDS1074CFL was tempting since it had 4 channels and a higher sample rate, but the higher sample rate isn’t that important to me at this point, and so not worth the extra $175 or so. The only other one I gave serious consideration to was the MSO1074Z, but the integrated logic analyzer just didn’t seem compelling enough to drop another $280 or so.


In the end, the Rigol DS1074Z won me over with its combination of price, four channels and deep memory, along with the fact that Rigol seems to be a well-understood quantity at this point. This model has been out for almost a year; most of the bugs are known, and many have already been addressed. There are lots of good in-depth reviews and tutorials for Rigol scopes, certainly more than I saw for Siglent/Atten/Gatten or Owon.

The Rigol has another thing in its favor: it apparently shares the same hardware as the DS1104Z, and people have figured out a way to unlock the higher bandwidth. They’ve also figured out how to unlock some otherwise extra-cost options; most of them don’t interest me, but it’s nice to have the option.

I ended up ordering my scope and paid less than $585 with free shipping, thanks to a discount offered to members of EEVblog.