
An anonymous reader writes "I'm a Solaris user, and Solaris is not well supported by the OSS toolchains. I'd like to have a dedicated Linux-based dev system which has good support for ARM, MSP430 and other MCU lines and draws very little power (5-10 watts max). The Beaglebone Black has been suggested. Is there a better choice? This would only be used for software development and testing for embedded systems."

I'd hesitate to call a quad-core Intel system 'incredibly cheap', but that aside.

In some cases, power can be rather more costly than that. For example, I did some simulations using accurate local solar data here in Scotland, and if I want a system that works 24*7, with storage to back it up, it comes out to around $200/W initial capital, and maybe $20/W ongoing (battery replacement), or $0.90/kWh equivalent. Assuming a 10 year life, that doubles it to $1.80/kWh. Or $15 per watt your device uses, per year, amorti
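A quick sanity check of the arithmetic above, as a sketch (assuming the quoted all-in cost of roughly $1.80/kWh and a continuous load; nothing here beyond the comment's own numbers):

```python
HOURS_PER_YEAR = 24 * 365              # ~8760 hours
COST_PER_KWH = 1.80                    # USD; capital plus battery replacement, amortized

def yearly_cost(watts: float) -> float:
    """Yearly cost in USD of a continuous load at the quoted $/kWh."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000.0
    return kwh_per_year * COST_PER_KWH

print(round(yearly_cost(1.0), 2))      # 15.77 -- matches "$15 per watt, per year"
print(round(yearly_cost(10.0)))        # 158 -- a 10 W dev board, per year
```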

It depends on your application space. If you're making a monitor to alert you when plants need to be watered, you're going to want a controller that can run on AA batteries for months and costs a few dollars. That's not anything that Intel sells -- that's more like the Atmel chip in an Arduino. So yes, compared to a high-end CPU, a low-end Intel CPU is cheap and low power, but compared to a $3 controller that can run on an AA battery for months, it's expensive and power hungry.

The problem domain isn't "monitoring plants with only an AA battery for power", though. It's a host capable of running Linux and assorted toolchains for embedded software development. I know that there are systems for all kinds of applications and the power envelopes they prescribe. That's not the issue. The issue is that someone proclaimed that "electricity is expensive. Douche bag." And that's not true. Electricity is cheap -- cheaper than hardware, which is incredibly cheap itself. That holds true even at the very low end, where a tiny amount of energy is sold in an expensive package: a finished system will still cost more than the batteries it takes to run it for the couple of years before it is replaced, yet the hardware is sold at prices which almost make the devices disposable. For perspective: how many smartphones does the average person buy per year? Still think a low- to mid-range quad-core desktop system is expensive?

So I do development and I want to cut my costs down. I presently have a desktop at home with a 600W supply and a server with a 250W supply, as well as laptops with 60-90W supplies. As I use the desktop for other things (e.g. playing DVDs, Netflix, etc.), I'm satisfied to leave it for now; the laptops might get replaced by tablets or Chromebooks.

But the server? I keep it on a UPS, and would love to be able to keep it up for a very long time. On the UPS it would only get between 10-30 minutes if power fails (APC

Not really. He's developing for MCUs, not for the system itself, so that has no point.

For it to be portable, it would need some other stuff, like a monitor, keyboard and other peripherals, all of which take power. Now, he might be doing it under solar power at his cottage or yacht or whatever, but then he'd still need the monitor and other peripherals.

Basically, the answer is to just buy a friggin' netbook. Even if you need to hav

One of the cool things about the Beaglebone Black and the Raspberry Pi is that they've got GPUs powerful enough to drive an HDMI display, and give you 1080p graphics if you make sure there's enough electric power and not too much interference (my RPi was a bit wonky on the last display I tried), so you can drive a decent monitor for programming or use it as a TV video player.

But if you don't need that, because you're doing X windows or just doing a bunch of ssh terminal sessions, you've got more potential c

It's my understanding that "install a bunch of GNU tools" is the first thing that many Solaris sysadmins do on a new system.

Anyway, why do you need a low-power ARM system? The description heading mentions "embedded", but your description mentions irrelevant stuff like Solaris and not the important stuff: what sort of embedded work you'll be doing -- industrial control, point-of-sale, sensor monitoring, etc., ad nauseam.

Simple native development can be a lot easier than cross development. If you have the money for some really good embedded tools, cross development is not bad at all. But if not, native development is a lot simpler. I would still do most of my work on an x86 Linux box and then move the project over to the embedded target for testing, but that is just me.

It's not really that much more complicated. I do cross dev at work, but at home I had a cross-dev setup for a handheld gaming machine we were porting Linux to up and running in under 30 minutes. It's really not that hard to just specify a target on a different host. This is all gcc as well; you don't need money for good embedded tools.

Probably because about 99.99% of questions such as this one play out like this:

"I need a hammer. What is a good hammer?"

"Why do you need a hammer?"

"I need a hammer to chop down trees."

"No, you need an axe."

They don't even allow questions like this on Stack Exchange, because they're so open-ended that they serve no purpose and provide no value (other than to instigate arguments such as this one, or fanboy arguments such as Home Depot hammers versus Lowe's hammers). I can tell you've never dealt with customers and requirements management, because understanding why customers need something is extremely important: it may lead to a better product for the customer, or new products for more or new customers. Lastly, you must be new to the internet if you go around assuming anyone knows shit (especially on Slashdot).

People often ask for help assuming an answer, and thus embed it in the question. The experienced helper asks probing questions to see what the asker really wants, and then answers that question. When you're older, you'll understand.

In this case specifically, embedded development typically requires specific "non-consumer" I/O requirements that little hobbyist systems just don't support. Thus, saying BeagleBoard or Udoo or RaspberryPi would steer him wrong.

OTOH, maybe he just doesn't know WTF "embedded" really means and is just tossing out the buzzword du jour, when a used laptop would serve his needs much better.

I find it incredibly annoying when I have accurately researched some topic and know what I'm doing, but when I'm asking about some detail, some jackass starts walking me through that whole jarring "why do you want to do that" dance!

Can you really not figure out that the solution to such a problem is to add more detail to your question, indicating what you've already researched?

It really isn't.

I ran into a fine example of why you are wrong just last week.

I was looking for a way for a .NET library developer to specify a type contract that included a non-default constructor with a specific prototype/signature. Now, to some this may sound like an Interface, but others will argue that Interfaces should not specify implementation details, and they (rightly or wrongly) include constructor prototypes as an implementation detail and argue that this is why interfaces should not (and do

The GPP is 100% right when he says "Just because you don't understand their needs doesn't mean you need to step in and try to change what you think they need. (Ever think they just MIGHT be smarter than you or know their needs better?)"

Which once again returns us to the basic questions being asked by the would be helpers: "What are you trying to accomplish?" Without that fundamental part of the picture, all but the most generic help is pointless.

To return to the current case in point, say the person is trying to build a plant-monitoring doo-hickey. The choice of platforms depends a great deal on a hundred little specifics. For example, if each device being designed will handle one plant only with just a few sensors, and it is going to ru

Which once again returns us to the basic questions being asked by the would be helpers: "What are you trying to accomplish?"

It's stated quite specifically already, so when you then go and ask that, you are of course doing exactly what I said you would do, proving my initial response: it really isn't helpful to describe in excruciating detail what is being tried.

The important specifics are already there: I need my generic class library to enforce a constructor contract on 3rd party code that calls my library.

Maybe you imagine that there isn't a need for it, but that's just proving the GPPP's point also: that you think you k

It's possible the person asking a question knows their stuff. It's possible. But we don't know that, which is why we ask probing questions.

...and by asking those questions you've trashed the original question. So when the person asking does know their stuff, the end result of your involvement is that you just fucked their thread over. At the very, very best you've delayed any meaningful response by literally days, because now everyone else is waiting for you to be answered.

Can you really not figure out that the solution to such a problem is to add more detail to your question, indicating what you've already researched?

Let's say you want to develop a 3D game that has to work on all the absolutely most crusty computers that can be found. Then you want to go with OpenGL 1.x and the fixed-function pipeline. Just observe all the whining that appears: how you should use shaders, and how even shader-based OpenGL 2.x is not sufficient, but for some academic reasons you want at least 3.x because it has the core profiles, so that you won't even accidentally be using any legacy functionality. Even despite the fact that games like Angry

"My hands are tied. Even though I want to use OpenGL 3.x, I can't. The specs say OpenGL 1. So, can you help me with OpenGL 1 or not?"

I was a programmer, and now I'm a DBA. When I ask for help, people understand that -- for example -- when I say "the machine runs SQL Server 2008R2" that there's zero chance of upgrading to v2012 or v2014 just to solve one itsy problem: there's too much effort involved in QAing a huge production environment.

What happens instead is that people latch on to some irrelevant detail in your context and the discussion gets instantly derailed in that direction, thus ensuring that your question never gets answered. It's particularly fatal to mention motive, because that's completely subjective. The only way to actually get useful answers to questions these days is to trim the context as ruthlessly as you possibly can.

One day someone needs to write a "How To Answer Questions The Smart Way".

I find it incredibly annoying when someone THINKS they have researched some topic and knows what they are doing.

You may be perfectly right and capable and interested only in the detail that you're asking about, but unfortunately you'd be in the minority. Probing questions will confirm everyone is on the same page, and while they may be annoying to you, they will be a godsend for many others.

Engineers in general are terrible at solving generic problems, which is ironic because we're thought of as the great problem solv

Yeah. People are asking questions because HIS questions are, when taken together, nonsensical.

He's looking for a good host machine to do development for ARM, MSP430, and other MCU embedded targets.

When doing embedded development, there is usually a very clear distinction between "target" and "host" - it is rare in the embedded world for people to use a device as both host and target (since the target is usually pretty weak CPU-wise), but he's implying that he wants to use a device that is usually a target

Which to anyone that has actually DONE this sort of development is nonsensical.

I happen to like doing my dev work directly on the BBB. I have a full scale machine that I use as a glorified display, and other servers that house subversion, and other needed resources.

Doing dev work directly on the BBB makes it far easier to deal with debugging problems in the field, because, by definition, I have my full debugging environment with me at all times. My dev environment is always exactly identical to the production environment, so I never have the "it works fine in the lab" scenario.

Try developing for an MSP430 on the MSP430... hence the distinction between 'target' and 'host'. When you can develop on the target, great. But a lot of these MCUs don't have the IOPS/RAM to run a generic scripting language, never mind a compiler.

An MSP430 has idle currents measured in uA, and a chip costs in the region of $1.50, with no external components required. BBB isn't useful in applications that require running off of a watch battery for a year, and isn't cheap enough to consider adding as an additional component in consumer electronics.
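The watch-battery claim is easy to sanity-check. A back-of-envelope sketch (the 220 mAh CR2032 capacity and the sample currents are assumptions, not MSP430 datasheet figures; self-discharge and active-mode bursts are ignored):

```python
CELL_MAH = 220.0                       # assumed CR2032 coin-cell capacity

def runtime_years(avg_current_ua: float) -> float:
    """Years a cell lasts at a given average current, ignoring self-discharge."""
    hours = (CELL_MAH * 1000.0) / avg_current_ua   # mAh -> uAh, divided by uA
    return hours / (24 * 365)

print(round(runtime_years(1.0), 1))    # 25.1 -- at a 1 uA idle current
print(round(runtime_years(25.0), 1))   # 1.0 -- at a 25 uA duty-cycled average
```

In practice the cell's own shelf life dominates long before the 1 uA figure runs it flat, but "a year on a watch battery" is clearly plausible.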

An MSP430 has idle currents measured in uA, and a chip costs in the region of $1.50, with no external components required. BBB isn't useful in applications that require running off of a watch battery for a year, and isn't cheap enough to consider adding as an additional component in consumer electronics.

And the MSP430 doesn't have enough horsepower for most things I want to do, and even if it did, the additional resources needed to design with it and the additional time-to-market they would introduce make it non-viable in today's world. As I said, time-to-market is everything. MS didn't get where they are because they made a superior product; they got there because they had a working product when the market opportunity arrived. Short TTM doesn't guarantee success, but TTM that is too long guarantees

So either he has VERY special unique requirements that he hasn't clearly communicated,

Why is low power consumption a special, unique requirement? All of my computer equipment was chosen and/or assembled with low consumption in mind. My desktop's TDP is under 350W and I can play games at 1920x1200, albeit not with everything turned on any more. I have a small fleet of netbooks for performing long-running tasks or for traveling; I sold an HP EliteBook and bought three of them. I even took an EEE 701 4GB running Jolicloud on a six-week vacation to Panama. My most power-hungry portable has two c

Actually the question is really freaking stupid. If you're serious about embedded development you cross-compile on the fastest computer you can find and then deploy to the target board. If the question had been "what's a good embedded target" it would have made marginally more sense, except that parameters there are how much i/o volume/cost, performance, etc.

For all we know this guy does development work inside his van and wants to drain the battery as slowly as possible.

Asking "why" is often a pretty good question, because you can't tell from the very first question whether the person actually knows their needs, or indeed knows what the fuck they are doing at all. Development on embedded systems is slow; the processors are slow. Is the poster willing to put up with slow compile times compared to a laptop?

They want a low wattage test system for doing embedded dev. Period. Don't skirt around it, don't try to poke and make fun of anything he says in the comment, either you can't help him or you can. MOVE ON.

The person doesn't really provide a power budget. Low power compared to what?

Are we talking a device that's going to need to run off of battery power for hours or days? Are we talking about a device that's going to be silent (no cooling fan)? Are we talking about a device that can have a cooling fan as long as it delivers good performance per watt? Who knows, the question doesn't specify.

Because there may be a better way to accomplish it. If your kid came to you and said, "Dad, can I get a rope, three pounds of butter and a mule?", would you give it to him, or ask what he was trying to accomplish?

Alright, I've been working on getting a build server setup on a BeagleBone Black that I had lying around. The ARM->x86_64 compiler I had to build so the executables could run elsewhere was a pain, so be careful about that. Also, the speed at which it compiles is "dog" slow. It reminds me of the stories I used to hear from old programmers about turning in their punch cards and waiting a day to get an answer back. It's not that bad, but it is slower. nohup quickly becomes your best friend. If it sounds lik

Zotac is currently bringing passively cooled quad core mini Intel boxes to market (the low end NUC has a fan but doesn't really need it under normal load). The Zotac ci320 nano looks particularly nice: Celeron n2930 (quad core, 1.8GHz) with a thermal design power of 7.5W and an even lower scenario design power. It offers a much better interface selection than the NUC: plenty of USB3 ports, display port, HDMI, eSATA, (shared SATA and mSATA inside). Costs about the same as the low end NUC.

You can get a netbook that will draw around 5-10W. If you get one with an Intel CPU and chipset you will have the advantage of massive compatibility, especially if you skip the original Atom chip. Once the dual cores came out, it was pretty well abandoned by everyone.

That, or get one of these ~$100 android units which also runs Debian. But I don't really recommend that. The only one which seems very performant and yet inexpensive is the mk908 which is a bit of a turd reliability-wise and which doesn't yet have complete hardware support, e.g. http://www.cnx-software.com/20... [cnx-software.com]

I got a dual-core Atom netbook for around 60 bucks, bumped it up to 2 GB of RAM, and slapped a 32 GB SSD in it. It runs a couple of days on a battery charge, and in total I have about $125 into it, thanks to eBay.

So what? A dual-core atom is actually pretty snappy. I personally have a crufty single-core atom, that is quite pathetic and I wouldn't suggest it to anyone. What I actually use for a 'netbook' is a Gateway LT3103u with the L110 chucked for an L310. Since Gateway finally released windows 7 x64 drivers and a BIOS with AMD-V it blew the hacks wide open. I haven't done a custom DSDT yet (though I should) but I do have SATA running in AHCI mode, necessary for automatic TRIM support. That took a hacked BIOS, but

Yes, and so do many netbooks. My Acer Aspire One D250 only came with 1GB, as did my EEE 701, but those are old. My LT31 came with 2GB. I upgraded the dimm to get lower-latency memory, mostly to speed up the integrated graphics. Most of the machines that will support 1GB will support 2GB and some of them will even take a 4GB SODIMM, but don't count on it.

Finally, an Ask Slashdot I can answer with personal experience and some authority!

Do yourself a favor and order a Shuttle DS437. I bought one myself and cannot think of a better little box for playing with embedded systems. Here's why:

It's small -- about the size of a 5.25" disk drive.

It's low-power -- not as low as you'd like, but less than 20 watts under load for the whole system. It's passively cooled.

It takes a 12V barrel plug from a standard 65-watt laptop power adapter (included) -- easy to replace anywhere in the world. Also good if the impetus for your low-power requirement is an exotic wish, like being able to run the system from battery or solar.

It's relatively inexpensive -- about $200 from Amazon.com, and qualifies for Prime shipping. You'll need to add storage and RAM, but maybe you have some DDR3 SO-DIMMs and a spare 2.5" drive kicking around from an old laptop.

It's got two DB9 serial ports, right on the front. Handy!

It's a modern system: 64-bit, dual-core, Ivy Bridge, SSE 4.2, supports up to 16GB of RAM.

It took Ubuntu 14.04 without any significant fuss. Most things worked out of the box. I'm not a Linux super-expert, but I got the rest working within an hour or so.

It's "only" 1.8GHz, but we're talking Ivy Bridge here, not some wimpy Atom or ARM core. Plus, in my experience you really want x86 for your host machine. Not every compiler or tool you might want to use is going to be supported on, say, a lower-powered ARM system.

I considered a lot of exotic ARM boards as my development host, including the BeagleBone, Jetson TK1, and a handful of others. I think the DS437 leads by a wide margin, but for what it's worth I considered the Jetson TK1 board a distant runner-up.

Also, I forgot to add -- beware the potentially confusing Shuttle DS47: it's nearly identical in appearance and pricing, but has a dual-core 1.1GHz Atom-based CPU inside, which is significantly slower than the Ivy Bridge (3rd-gen Core i-series) processor in the DS437.

Looks like you didn't read the question. The inquirer isn't looking for an ARM system: The specification is for a low power system which runs Linux so that it will support the toolchains for ARM and other common embedded CPU architectures. The Shuttle DS437 runs Linux, as described by Ravyne. Idle power consumption certainly fits the desired envelope of 5-10W. It's a little more under load, but you get vastly more processing power in return for that, and the widest support of developer tools available. Ther

Raspberry Pi: don't like the fact that you have to boot off the SD card.
BBB: no complaints; a nice board, and it has an optional display that's pretty nice.
PcDuino: my favorite; more memory and flash than the other two devices, and the v3s is in a really nice case.

I have a Pi and a PcDuino v2, and the PcDuino is definitely more capable and doesn't cost all that much more. The Arduino compatibility and WiFi are nice, as is having enough flash on board to boot without an SD card, although I generally use one since they are faster. I use their LinkSprite shield for prototyping things, since it breaks out the I/O pins into a nice connector for easy use.

The ARM architecture has some fairly good Linux support and wide adoption.

One of my favorites out there today is the A10-OLinuXino-LIME. This is a low-cost 1GHz ARM board with a Mali-400 GPU, a SATA port, a 100BT port, and two USB ports for under $50. I'm a big fan of the SATA port... using an SSD for the system solves many reliability problems. It also has support for a LiPo battery, but I haven't tried it.

Perhaps the best value/performance is the Wandboard QUAD: a quad-core i.MX6 with 2GB RAM, WiFi, SATA, and an OpenCL

The Beagle Bone was good in its day, but it is kind of over the hill. The processor is underpowered compared to other ARMs

Just to be clear, the A10-OLinuXino-LIME, BeagleBone white and BeagleBone Black all contain a single Cortex-A8 core, and the TI AM3359 runs at the same 1GHz speed in the BBB as the Allwinner A10 does in the LIME.

The original BeagleBone (white) ran its AM3359 at 720MHz so its CPU performance is a bit less, but the BeagleBon

Because of the poor reliability of MMC, I prefer to use SSD these days

MMC reliability is fine. I thought I was going to have problems using the MMC on the BBB as well, so I set about beating several of them severely. I set up accelerated read-write-read testing and started pounding on the BBB internal MMC. With 3 boards at over 5M writes to a single 512-byte block each, none of the devices failed. I read some literature which suggests that the MMC rotates sector usage to even out wear, which, if true, means that you would have to do the equivalent of recording 100,000 hours of
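If the controller really does rotate sector usage, the 5M single-block writes above sit far below the theoretical endurance. A rough sketch with hypothetical round numbers (2 GiB part, ~3000 program/erase cycles per block, write amplification of 1; none of these are published BBB specs):

```python
CAPACITY_BYTES = 2 * 1024**3           # assumed 2 GiB eMMC
PE_CYCLES = 3000                       # assumed program/erase endurance per block
SECTOR = 512

# Total 512-byte writes the whole device can absorb if wear is spread evenly.
total_sector_writes = (CAPACITY_BYTES // SECTOR) * PE_CYCLES
print(f"{total_sector_writes:.2e}")    # 1.26e+10 -- vs the 5e6 writes observed
```

On those assumptions, the test above exercised well under a thousandth of the device's wear budget, which is consistent with no failures being seen.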

I'm sure that TI will be making 335x until people stop buying them. TI generally doesn't EOL parts like that. But putting whole BBBs in products seems a bit risky for a lot of other reasons.

As opposed to undertaking to spin your own processor board? The BBB is a complete functional platform that is cost-competitive for all but the largest quantities, and shows all the signs of being at the beginning of its life-cycle. It's undergone 2 minor revisions in 12 months, and there are several active design communities. The list of peripherals is growing by leaps and bounds. Lastly, by Beagleboard.org's own accounting, the demand far exceeds the supply, and people are clearly using them as more than jus

The commoditization of embedded hardware design happened over a decade ago. Have you heard of Kontron? PC104? COM Express? You seem to have missed the 2000s... this is nothing new. These days it is amazing what is put on a DIMM module -- far more than the BeagleBone and Pi toys provide, and at far lower unit prices.

The commoditization of these designs depended on several factors happening all at once.

First, processor power had to pass a threshold. A processor that is fast enough to handle an embedded system running a custom operating system (or, more likely, just a simple set of interrupt handlers and startup code) is a lot slower than the processor needed to run a full-fledged kernel like modern Linux. The custom software saves huge amounts of unit cost, at the cost of time-to-market.

Why do you want to _develop_ on an embedded system?!? Use a Linux PC for development and then test your code on your embedded platforms. I use Ubuntu for the former, with either buildroot or a direct gcc eabi. If the development platform _must_ be low power -- like you develop from an African field with a solar panel -- get a netbook.

Look at the Olimex range of boards.
I've been using these for a year or two and found them to fit the bill nicely.

There are single- and dual-core boards, with or without embedded flash memory (or micro-SD card slots), and they'll run Debian (or other) Linux.
They have a lot of on-board peripherals and pinouts for their own range of LCD screens -- though I use an HDMI monitor for simplicity.
The power supply will accept anything from 6-16 volts from a phone-charger type PSU, and you can even plug in a LiPo for

http://ark.intel.com/m/product... [intel.com]
The Intel Silvermont Atom boards are very electrically efficient and offer surprisingly good performance.
You can buy a board for under US$100 and all you need to add is case, PSU, RAM and mass storage.
Some boards have VGA, some DVI, with or without legacy serial and parallel ports -- lots of choices. Manufacturers include Gigabyte, MSI, Asus and Supermicro.

Personally I'd have a netbook...
In a lab (garage, whatever) I'd have the oscilloscopes, logic analyzers, meters, power supplies, etc. connected to a beefy desktop with plenty of RAM for running VMs, so I don't stuff around with the build environment.
When coding just remote (SSH/RDP/whatever) to the VM of choice.
You don't mention price so stick solar panels, batteries and inverters on it till you're sub 5W.

This is a new chip with an ARM Cortex-A5 core, making it directly compatible with all distributions that have an 'armhf' port, like Debian, Ubuntu or Linaro. I like the fact that it is compatible with the Arduino Due connector. It's probably the easiest Linux-based, Arduino-hardware-compatible board.

Even an old 100-watt laptop will compile your code many times faster than something like the Black or Raspberry Pi will. A gig of RAM and swap space, something that embedded systems don't normally have, will make a huge difference. Just throw a small SSD (or boot from a USB stick) into an old 2-core i3 with a crap graphics card, and watch your code compile faster than a 5-watt embedded device will even launch your IDE or get through the first source code file.

How much does it really matter when a small project takes only 15 seconds to compile on the BBB? So what if a cross-compiler can do it in 2 seconds? It'll still take close to 15 seconds total when you include the time to type the commands to download the executable. Even if it were only 7 seconds, it is still only a negligible gain.

Now if you were compiling a kernel, or God forbid something really big like OpenOffice, then I could understand, but for the vast majority of embedded work, i
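The tradeoff above reduces to simple arithmetic. A sketch (the 15 s native and ~7 s cross totals are from the comment; the 5 s transfer/flash overhead is an assumption):

```python
def native_cycle(compile_s: float = 15.0) -> float:
    """Compile on the target: the binary is already where it runs."""
    return compile_s

def cross_cycle(compile_s: float = 2.0, transfer_s: float = 5.0) -> float:
    """Compile on a fast host, then download the executable to the target."""
    return compile_s + transfer_s

print(native_cycle())                  # 15.0 seconds
print(cross_cycle())                   # 7.0 seconds -- faster, but a small absolute win
```

The gap only becomes decisive when compile_s on the target grows into minutes (kernels, large applications), which is the point being made here.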

We don't know what the OP is attempting to compile. It might be just some code to run inside the Arduino bootloader, or it might be a whole muLinux and a local GCC for the target. Heck, the Debian distro for the BBB might not have a binary for the target muC, which means compiling GCC before compiling the code (worst case). There is also the possibility of OPs chips not being supported in GCC. For example, I just picked up some Cypress PSoC boards; the tools for them are only available right now on Windows.

There are guides on the internet to installing different versions of Linux on it (unless you want to do Android dev).
I bought the A1000 version and a laptop HD (plugs in the top).
Then I installed Debian (using an online guide) and MiniDLNA. I use it as a media server for my TV.

The embedded ARM boards from Technologic Systems [embeddedarm.com] are worth looking at also. I used a TS-7260 with a large enough SD card to install Debian with gcc and it worked great. It booted nearly instantly and consumed something like 100mA of current at 3.3V IIRC. It was quite a robust little box. There are newer and faster models than the TS-7260 at the link I provided above.

I just checked and they state that the TS-7260 draws half a watt minimum and 2W typical... I seem to recall it drawing even less than that though. Perhaps it was 200mA @ 3.3V which would fit within their spec.
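The recalled figure checks out against the quoted spec via P = V * I (the 0.5 W / 2 W envelope is from the vendor numbers cited above):

```python
def power_w(volts: float, amps: float) -> float:
    """Power in watts from voltage and current."""
    return volts * amps

recalled = power_w(3.3, 0.200)         # 200 mA at 3.3 V
print(round(recalled, 2))              # 0.66 -- watts
print(0.5 <= recalled <= 2.0)          # True -- within the 0.5-2 W spec
```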

I am developing a system targeted to run on a wandboard (www.wandboard.org), which is a really good "embedded" system similar to the BeagleBoard, but uses a Freescale iMX6 A9 Arm processor and is available in single, dual & quad core CPUs with 512MB to 2 GB RAM.

However, even with the quad core 2GB ram version, builds take a really long time, so I use a regular PC that I built using a new Haswell CPU, 16GB ram and a 240GB SSD to do development and even unit testing. I can make several changes, build ever

Don't buy anything today. Wait until there are media boxes with quad Cortex A15/A17 chips and buy one of them. They'll be out any week now. Rockchip RK3288 is coming, should be affordable, and the company is spending a lot of effort making sure it's well supported in mainline.

The Cortex-A9 hails from 2007. It's ancient. The GPUs are at best old Mali-400s. The compute per watt is not great.

If you want to go really low power- if battery life is your concern and you don't actually have serious CPU use (you mention MSP430, so it sounds like you don't have real CPU use needs) get a Cortex A7 or Cortex A5. There are dozens of dual core Allwinner A7 boards out there. A5 has slimmer pickings, but will get you pleasantly below the one watt range, and the boards come with more embedded targeted peripherals that might not be included on media devices.

"Rockchip RK3288 is coming, should be affordable, and the company is spending a lot of effort making sure it's well supported in mainline."

Citation needed. Mind supporting your statement with a link? AFAIK RK has one of the poorest FOSS support records among Chinese SoC makers (compared to Allwinner and Amlogic). The RK source code floating around the net tends to be "leaks", or in any case releases that aren't officially supported by the company. Also, for a long time there was no official way to flash firmware onto the em

The vendor-supplied tools may be Windows-only, but chances are there is a gcc backend available for the target architecture these days. I wouldn't like to be using an ARM board for my cross-compiling, though; getting QEMU set up for any compilation steps that need to run on the target architecture is enough of a nightmare on Intel, let alone other architectures that no one has used that way before.

The new BBB version is popular and hard to find but the extra flash is nice.

https://specialcomp.com/beaglebone/index.htm

How many do you want? These are not from Beagleboard, and are not Beagleboard certified, but they can be had in almost unlimited quantities and work as advertised. They are manufactured to the open specs. They are being manufactured by a third party in China in vast quantities for commercial use. They cost more than the official versions mostly because you can actually get your hands on them from these guys in nearly limitless quantities (up to 100+ they have in s

If you are building a project which requires some special hardware then you don't have to waste time porting a driver from x86 to ARM, MIPS, Sparc, etc.

In my experience, most embedded machines these days are ARM, and finding x86 ports is more of a challenge. Between the explosion of ARM based cell phones, and the Rpi/BBB, x86 is becoming less and less relevant (and with it MS/Intel)

Active android development is almost all ARM, and x86 is ported as an afterthought.

The vast majority of cell phone makers use ARM based processors, and with Smartphones, battery life is a gigantic deal breaker. This would suggest to me that large numbers of design engineers concluded that ARM was at the very least "good enough" in power efficiency to allow its use.

This leads me to conclude that either ARM is better in this department, or that the difference is trivial enough that other trade offs make it worth it.