Researchers have created a metal-based molecule that could bring down the cost of solar cells by replacing the more expensive, rarer metals used in their production today. Specifically, a team at Lund University in Sweden has created an iron molecule that can work both as a photocatalyst to produce fuel and in solar cells to produce electricity, according to a Lund University news release.
Iron Is Easier
With iron making up six percent of the Earth’s crust, the metal is significantly easier to source than rare metals, such as ruthenium, osmium, and iridium, typically used in this type of technology, said chemistry professor Kenneth Wärnmark of Lund University, who led the research. “Our results now show that by using advanced molecule design, it is possible to replace the rare metals with iron, which is common in the Earth’s crust and therefore cheap,” he said in the news release.
Some photocatalysts and solar cells are based on a technology that uses metal complexes to absorb solar rays and use their energy. The problem is, these metals are typically rare and thus expensive and difficult to source, driving up the cost of the technology in which they are used.
For some time, Wärnmark and his team have worked to find alternatives to expensive metals, focusing on iron because of its prevalence. In previous research, they already produced their own iron-based molecules and proved that they could potentially be used for solar-energy applications, he said.
Glowing Long Enough
Their latest research demonstrates the development of a new iron-based molecule that can capture and use the energy of solar light for long enough that it can react with another molecule. The molecule also glows long enough that researchers could, for the first time, observe light from an iron-based molecule with the naked eye at room temperature. Researchers published a paper on their work in the journal Science.
This new molecule shows versatility in its uses, researchers said. Its key application could be in new types of photocatalysts for the production of solar fuel in two ways—either as hydrogen through water splitting or as methanol from carbon dioxide, researchers said. Another potential use is in the production of light-emitting diodes.
Indeed, even researchers were surprised at how quickly they were able to turn an iron molecule into a material for photochemical applications, with properties on par with far more expensive and much rarer metals, Wärnmark said. “We believed it would take at least ten years,” he said. His team—which also included researchers from Uppsala University and the University of Copenhagen—managed to do it in five.
Elizabeth Montalbano is a freelance writer who has written about technology and culture for 20 years. She has lived and worked as a professional journalist in Phoenix, San Francisco, and New York City. In her free time, she enjoys surfing, traveling, music, yoga, and cooking. She currently resides in a village on the southwest coast of Portugal.
There’s a reason the U.S. trade deficit with China exceeded $301 billion in 2018. Cheap labor and fewer regulations mean that “Made in China” tends to cost less than “Made in America.” But the ongoing trade war with China has led many companies to re-evaluate the practice of outsourcing from a cost-efficiency perspective. That’s with good reason, since the U.S. is currently imposing tariffs on about $250 billion worth of products coming from China.
The notion that outsourcing could be detrimental in the long term isn’t a new idea—nor is it specific to the current tariffs. In the last few years, a number of companies have realized that there are often hidden costs associated with outsourcing. One of the more famous examples is the UK’s renowned car manufacturer, Aston Martin.
Aston Martin’s Outsourcing Headaches
In 2014, Aston Martin issued a recall for 17,950 of its vehicles. The cause was directly related to outsourcing. The recall debacle began with Aston Martin receiving multiple customer complaints about broken throttle pedals in certain vehicles manufactured after 2007. In response, the company launched an investigation into the pedals, and the engineers quickly uncovered the issue: The pedals had been manufactured with counterfeit material. The company’s engineers had specified that the pedals should be constructed with injection-molded DuPont PA6 plastic, but the broken pedals were made with counterfeit resin.
The backstory of these faulty pedals is a maze of complex supply chain dynamics. The pedals themselves were assembled by Precision Varionic International (PVI) in Swindon, England. According to its website, PVI does tooling in-house, but uses manufacturers based in China. In this particular instance, PVI got the parts from Hong Kong-based Fast Forward Tooling. The subsequent supply chain is murky, but Aston Martin believes that Fast Forward Tooling subcontracted the actual molding of the pedals to Shenzhen Kexiang Mould Tool Co., which in turn purchased the counterfeit resin from Synthetic Plastic Raw Material Co. Ltd. in Dongguan, China.
In reality, the precise origin of the counterfeit material is unknown, as Shenzhen Kexiang Mould Tool Co. claims to have had no dealings with Fast Forward Tooling. What is clear, however, is that Aston Martin was far removed from the business end of its supply chain, and the quality of its product suffered. Fortunately for Aston Martin, the financial implications of the recall were not catastrophic. But they certainly could have been.
Small Issues Can Cause Significant Problems
The benefits and pitfalls of outsourcing are by no means specific to Aston Martin or the automobile industry. Offshore manufacturing, in particular of plastics, can be less expensive than partnering with a domestic manufacturer—at least initially. Yet with that immediate gain comes an inevitable loss: the visibility into who is actually manufacturing your parts—and how they’re doing it. Without insight into a subcontractor’s processes and materials, it’s nearly impossible to guarantee the quality of the product. And in the long term, seemingly small issues can work their way up the supply chain and cause significant problems for your business and reputation.
That’s why even before the trade war, many companies began reshoring plastics manufacturing to domestic partners upon whose practices and processes they could rely. In some cases, to offset costs, these companies would keep their assembly plants offshore in order to take advantage of lower labor prices.
However, as the trade war with China shakes out, reshoring the manufacturing of plastic parts is something every business should consider for long-term financial health.
Jason Middleton is the VP of sales and development at plastics manufacturer Ray Products. Founded in 1949, Ray Products provides advanced thermoforming solutions to clients in a wide range of industries, including medical device manufacturing and transportation.
After decades of being looked at as more of a subculture (or arguably counter-culture) in the larger technology landscape, open source is finally getting its due. 2018 saw some big moves in the open source landscape—from automotive startups leveraging open source in their innovative products to some big acquisitions and further growth of the open-source hardware market.
That’s not to say that open source has shed all of its challenges, critics, or even stigma. But the road ahead offers plenty to be optimistic about for open source enthusiasts.
Big Acquisitions Mean Big Faith
Over the summer, Microsoft turned heads when it announced it would be purchasing GitHub, the Internet’s largest repository of open source code, to the tune of $7.5 billion in stock. Microsoft said the deal was part of its growing commitment to support developers in the open source community, with Microsoft CEO Satya Nadella saying the company is “all-in on open source.”
The acquisition has drawn its fair share of criticism and concern from those who view this as a Big Brother move: a large, faceless corporation digging its tentacles into a smaller, innovative community. Some GitHub users even removed their code and headed for alternative sites in the wake of the deal.
How Microsoft’s ownership will impact GitHub at large remains to be seen. But the fact that a company as large as Microsoft would invest so heavily in open source suggests that it views open source as a viable source of potential IP and other solutions.
In October, the community saw another high-profile acquisition in the form of IBM’s purchase of Red Hat—the company most noted for Red Hat Linux. In recent years, Red Hat has been branding itself as a provider of open-source enterprise software solutions, which is precisely what attracted IBM to acquire all outstanding Red Hat stock at a total enterprise value of approximately $34 billion.
IBM is betting that the inclusion of Red Hat will bolster its portfolio of enterprise-level cloud solutions. In the past, IBM had functioned as more of a collaborator with Red Hat, helping to develop enterprise cloud solutions and grow Red Hat’s enterprise-grade Linux business.
In a press release regarding the deal, IBM chairman Ginni Rometty said the Red Hat acquisition will make IBM the world’s number one hybrid cloud provider. “Most companies today are only 20 percent along their cloud journey, renting compute power to cut costs,” Rometty said. “The next 80 percent is about unlocking real business value and driving growth. This is the next chapter of the cloud. It requires shifting business applications to hybrid cloud, extracting more data, and optimizing every part of the business, from supply chains to sales.”
“Joining forces with IBM will provide us with a greater level of scale, resources, and capabilities to accelerate the impact of open source as the basis for digital transformation and bring Red Hat to an even wider audience—all while preserving our unique culture and unwavering commitment to open source innovation,” Jim Whitehurst, president and CEO of Red Hat, said in a press statement.
Like the Microsoft/GitHub deal, it’s too early to see the actual impact of the IBM/Red Hat deal just yet. In contrast to Whitehurst’s comments, some—like Forbes contributor Jason Bloomberg—have questioned whether IBM and Red Hat’s company cultures can actually gel.
Open Source on the Road
In part, what’s made companies apprehensive of open source in the past has been its inherently open nature, which brings up natural concerns around IP protection as well as cybersecurity. Those fears appear to be subsiding, however, and more and more companies are becoming open about their use of open source in development. Some are even releasing open source data sets to the larger developer community.
FLIR Systems—the largest commercial manufacturer specializing in thermal imaging sensors, components, and cameras—recently released a thermal imaging dataset of over 10,000 images to the wider community. The hope is that engineers working on ADAS and other machine learning applications for connected, semi-autonomous, and autonomous cars will use the thermal images to improve the artificial intelligence behind these systems.
In an interview with Design News, Mike Walters, FLIR’s vice president of micro-camera product development, said any risks associated with open sourcing its dataset were far outweighed by the potential benefits of allowing OEMs and startups to create safer vehicles. “Providing a free, annotated thermal image starter dataset for ADAS and autonomous vehicle developers will help speed up testing of thermal cameras and ultimately improve safety and reliability within the ADAS sensor suite,” Walters said.
Elsewhere in the automotive space, experts are looking at open source as a means of bringing much-buzzed-about blockchain technology into car manufacturing, sales, and vehicles themselves. The biggest advocates in this arena are the members of the Mobility Open Blockchain Initiative (MOBI), a consortium of companies and automakers dedicated to accelerating the adoption of blockchain in automotive.
The biggest hurdle for blockchain in the automotive space today is the lack of standards and best practices. MOBI advocates for an open source solution to this. By open sourcing blockchain software, the organization believes OEMs and developers can more easily create a set of standards and tools that everyone can use to make auto plants and vehicles safer and more secure in an increasingly connected vehicle landscape.
Open Source Hardware Gets Serious
But perhaps no aspect of open source saw more excitement in 2018 than open-source hardware. Building custom silicon chips—a notion once far too impractical and costly for all but the largest chipmakers—is now a serious reality, thanks to RISC-V, an open-source instruction set for building chip architectures.
In November, the RISC-V Foundation, a nonprofit aiming to drive adoption of RISC-V, announced a partnership with the Linux Foundation to accelerate the development of RISC-V by offering more tools, education, and support to the RISC-V developer community—including legal and marketing expertise. “RISC-V has great traction in a number of markets with applications for AI, machine learning, IoT, augmented reality, cloud, data centers, semiconductors, networking, and more,” Jim Zemlin, executive director at the Linux Foundation, said in a press statement. “We look forward to collaborating with the RISC-V Foundation to advance RISC-V ISA adoption and build a strong ecosystem globally.”
SiFive, a chipmaker founded by some of the developers of RISC-V, kicked off 2018 by releasing the HiFive Unleashed, the first Linux-compatible RISC-V SoC. The release marks the first time that engineers will have access to a high-quality development board platform based around a RISC-V chip.
Other companies followed SiFive’s lead throughout the year. In early December, Microsemi—a subsidiary of Microchip Technology—announced it was extending its family of FPGAs with a new RISC-V-based SoC. Using a core complex developed by SiFive, Microsemi’s new PolarFire SoC aims to offer embedded developers a low-power and more flexible solution for developing FPGA-based IoT devices.
December also saw data storage giant Western Digital announcing that it was beefing up its own internal RISC-V developments—specifically, an open source RISC-V instruction set simulator, an open standard initiative for cache coherent memory over a network, and its very own open-source RISC-V core, the SweRV core. This news comes in the wake of Western Digital’s prior commitment to transition one billion of its processor cores to RISC-V.
“As Big Data and Fast Data continue to proliferate, purpose-built technologies are essential for unlocking the true value of data across today’s wide-ranging data-centric applications,” Western Digital’s CTO, Martin Fink, said in a press statement. “Our SweRV Core and the new cache coherency fabric initiative demonstrate the significant possibilities that can be realized by bringing data closer to processing power. These planned contributions to the open-source community and continued commitment of the RISC-V initiative offer exciting potential to accelerate collaborative innovation and data-driven discoveries.”
The SweRV is a 28-nm, 32-bit core with a 9-stage pipeline and a clock speed of up to 1.8 GHz. Western Digital plans to use the core in its own embedded products, including solid-state drives and flash controllers, and also hopes that open sourcing the core will help drive development of IoT, secure processing, and industrial control applications.
Other companies are releasing RISC-V chips for more specific applications. California-based startup Esperanto Technologies, for example, is developing RISC-V-based chips designed for artificial intelligence applications, including machine learning and deep learning.
Careful Steps Forward
But with all this excitement, open source is not free and clear of warnings and cautionary tales—particularly in the embedded realm. A 2017 survey by GitHub outlined a number of concerns among members of the open source community. Paramount among issues encountered by developers using open source were incomplete or confusing documentation and a lack of responsiveness and support from developers. Among those surveyed by GitHub, 93% said there is a “pervasive problem” in the open-source community with incomplete or outdated documentation.
Concerns like these are why many will still encourage developers to shy away from open source. Writing for Design News, Jacob Beningo, an embedded software consultant, outlined several reasons why engineers would be wise to avoid open source. In addition to the issues outlined in the GitHub survey, a lack of a traceable development cycle, potential IP conflicts, a lack of testing, and difficult integration can all be hurdles toward implementing open source, according to Beningo.
Rod Cope, CTO of Rogue Wave Software, spoke with Design News regarding pervasive issues with developers and engineers failing to properly document, or understand, their use of open source. Cope told Design News that in the rush to find quick solutions to coding problems, developers will often pull a bit of open source, rather than write new code from scratch. It makes sense if the code has been tested and works well. However, as Cope said, in many cases, developers won’t tell anyone that they’ve done this.
“That will happen across teams—especially in a decent-sized organization, where the developers don’t necessarily know all the components that are being brought into a device,” Cope said. “…Before you know it, there are a bunch of different components being put in where only one person knows about each of them. And suddenly, you’re shipping something with a lot of open source in it. And some of those licenses can run into trouble, depending on the way you use them.”
Cope did, however, say there are ways for developers to navigate these tricky waters. As more companies integrate open source, best practices are vital. While the origins of code may not be as much of a concern for DIY hobbyists and Makers creating projects in the privacy of their homes, large companies and serious developers face big consequences if they aren’t careful.
“The one thing I always try to point out to people is: Once you’ve downloaded open-source code, whatever you bring in, you own it in the sense that you’re responsible for what’s put into your embedded device,” Cope said. “…You can’t just kind of sit back and say, ‘The community will tell me if something goes wrong.’ You’re not subscribing to the open source community. You have to proactively reach out and check, so make that part of your process around adopting open source.”
Traditionally, preventive maintenance (PM) relied on industrial or in-plant average life statistics, such as mean time to failure (MTTF), to assist in scheduling maintenance events. With PM, a service log served as the communication instrument, alerting plant or industrial maintenance technicians to machine breakdowns occurring over time. Technicians documented detailed information on each machine and its breakdown conditions in the log. Although the process was effective in recording the occurrence after the repair, machine breakdowns were still quite costly.
Today, however, maintenance activities can be effectively minimized by using predictive analytics (PA). With PA, a machine’s system efficiencies and electrical and mechanical conditions can be directly monitored using a variety of non-invasive measuring instruments.
What Is Predictive Analytics?
Industry 4.0, or the Industrial Internet of Things (IIoT), uses connected devices or machines to aggregate data to the cloud. With this aggregated data, statistical algorithms and machine learning (ML) techniques determine a device’s or industrial equipment’s future performance outcomes. These future outcomes are traditionally identified as the machine’s health or behavioral status. The primary objective of PA is to assess what will happen in the future based on previous or current knowledge and experience. PA drives predictive maintenance (PdM) activity through data analytics and ML modeling techniques. Fault detection, a critical PA concept in PdM, is well accepted in Industry 4.0.
One of the most reliable fault predictors is the Principal Component Analysis (PCA) algorithm. PCA analyzes the aggregated data to predict a future fault within an industrial machine, allowing catastrophic machine failures to be avoided.
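The PCA-based fault-detection idea can be sketched in a few lines: fit principal components to baseline readings from a healthy machine, then flag new samples whose reconstruction error (distance from the healthy subspace) is abnormally large. The sketch below uses synthetic data; the sensor names, values, and threshold choice are illustrative assumptions, not any vendor's actual implementation.

```python
import numpy as np

def fit_pca(baseline, n_components=2):
    """Fit PCA to baseline (healthy) readings: rows = samples, cols = sensors."""
    mean = baseline.mean(axis=0)
    # Principal axes come from the SVD of the mean-centered data
    _, _, vt = np.linalg.svd(baseline - mean, full_matrices=False)
    return mean, vt[:n_components]

def spe(sample, mean, components):
    """Squared prediction error: distance of a sample from the healthy subspace."""
    centered = sample - mean
    projected = components.T @ (components @ centered)
    return float(np.sum((centered - projected) ** 2))

# Synthetic healthy baseline: vibration (mm/s), bearing temp (C), current (A)
rng = np.random.default_rng(0)
baseline = rng.normal([1.0, 40.0, 5.0], [0.1, 1.0, 0.2], size=(200, 3))
mean, comps = fit_pca(baseline)

# Alarm threshold: 99th percentile of SPE over the healthy baseline itself
threshold = np.percentile([spe(x, mean, comps) for x in baseline], 99)

healthy = np.array([1.05, 40.5, 5.1])
faulty = np.array([2.5, 55.0, 7.0])  # drifting vibration and temperature
print(spe(healthy, mean, comps) < threshold, spe(faulty, mean, comps) > threshold)
```

In practice the baseline would come from logged sensor data on a known-good machine, and the threshold would be tuned against the false-alarm rate the plant can tolerate.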
Predictive Maintenance Tools
To collect factual data on an industrial machine’s mechanical or electrical condition, sensor-based monitoring devices are required. The data provides information for scheduling maintenance events on the industrial machine. Such a schedule can minimize or possibly prevent electrical and mechanical breakdowns of plant floor equipment, and PdM can capture machine anomalies before they become serious. There are three predominant PdM categories with associated tools that can assist the industrial maintenance or plant technician in this data aggregation process:
Electronic Sensors
Industry 4.0 is heavily dependent on electronic sensors for collecting industrial machine performance data. Electronic sensors allow slight changes to be monitored using wireless communications such as BLE (Bluetooth Low Energy) or WiFi protocols. ABB’s Ability Smart Sensor allows monitoring of low-voltage motors with a WiFi-enabled unit. The Ability Smart Sensor can monitor the motor’s health parameters, such as vibration, bearing condition, and temperature.
Analytics and Monitoring Tools
Analytics and monitoring tools measure an industrial machine’s performance on a regular basis. System parameters are measured and tracked over time. The aggregated data allows corrective action to occur before a catastrophic failure occurs. The software used to collect the data can perform the analytics and predict appropriate repair cycles for the industrial equipment. A remote monitoring system developed by Fluke allows data logging of the health status of industrial machines. The Fluke Condition Monitoring tools allow capturing of data and making PdM decisions based on trendline analysis (Linear Regression ML). The data is aggregated from the condition monitoring tools to the cloud using Fluke’s gateway device and software.
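Trendline analysis of the sort described above reduces to fitting a linear regression to logged readings and extrapolating to an alarm limit. The following is a minimal sketch with an invented temperature log and limit, not Fluke's actual software:

```python
import numpy as np

def fit_trend(days, readings):
    """Least-squares linear trend: readings ~ slope * day + intercept."""
    slope, intercept = np.polyfit(days, readings, 1)
    return slope, intercept

def days_until_limit(slope, intercept, limit, today):
    """Days remaining until the fitted trendline crosses the alarm limit."""
    if slope <= 0:
        return None  # no upward drift, nothing to schedule
    return (limit - intercept) / slope - today

# Weekly bearing-temperature log (degrees C), drifting slowly upward
days = np.array([0, 7, 14, 21, 28, 35])
temps = np.array([61.0, 62.1, 63.2, 63.9, 65.1, 66.0])

slope, intercept = fit_trend(days, temps)
remaining = days_until_limit(slope, intercept, limit=75.0, today=days[-1])
print(f"trend: {slope:.3f} deg/day, alarm limit reached in ~{remaining:.0f} days")
```

A scheduler would then book a maintenance event comfortably inside that predicted window rather than waiting for the limit to be hit.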
Scheduler Systems
With data aggregated to the cloud, PM scheduling activities can be developed. PdM is a subset of the planned PM cycle. This data aggregation is accomplished with IIoT electronic sensors, condition monitoring systems, and ML algorithms. The goal of PdM is to monitor the industrial equipment and act on it when necessary. Automation techniques used in schedulers help deploy PdM activities to the industrial maintenance or plant technician. The primary function of the scheduler tool is to provide maintenance analysis data based on trends identified within the data collection software. The PM scheduler then allocates the resources necessary to carry out the planned maintenance of the industrial machine.
With the assistance of PA, PdM can help minimize machine downtime. Electronic sensors, condition monitoring, and scheduler systems aided by PCA and ML techniques can deliver significant cost savings within an industrial environment. Additional information on Fluke’s Condition Monitoring tools can be found on its website, and information on the ABB Ability™ Smart Sensor is available on ABB’s website.
Don Wilcher is a passionate teacher of electronics technology and an electrical engineer with 26 years of industrial experience. He has worked on industrial robotics systems, automotive electronic modules and systems, and embedded wireless controls for small consumer appliances. He’s currently developing 21st century educational products focusing on the Internet of Things for makers, engineers, technicians, and educators. He is also a Certified Electronics Technician with ETA International and a book author.
Here’s how to build your own smart lights that not only sense motion, but can also respond to Alexa commands and even send text and email notifications.
While there are more than a few applications that would benefit from a motion detector connected to a light or alarm, my first priority is the dark hallway to my bathroom.
This offering is a voice-configured, PIR sensor-based light that can illuminate a dark hallway and double as a security monitor or a smart light for your yard or driveway. It won’t just turn on a light. Thanks to an MQTT or IFTTT (If This, Then That) link or service, it can text you the time of an event, trigger a sequence of events, or even dial 9-1-1.
The project includes Alexa voice commands for light on, light off, setting the floor-light time (10 to 99 seconds), and enabling action modes for off, light, and/or email services.
There are two packaging form factors. The first was installed into a Sonoff Basic (ESP8266-based). The second has less hardware, but uses an NMOS transistor to switch the light fixture. The code differences between the two are integrated into the provided .ino program file and selected with a few Boolean flags; they are limited to changes in the pin allocations available from the Sonoff and the ESP-01.
It is coded so functions can be Alexa-, voice-, and cell phone-based, and so Alexa routine-based timing and customized sequences can be created from its app.
The hardware uses only the pins available on the module. The module is post-regulated from the +5 volts found on the board. A header must be installed to reprogram the processor through those pins. A tapped +5 volts taken from the Sonoff module is also exploited in this project.
The Sonoff Basic’s module is hardwired on its circuit board: the relay is on pin 12, the button on pin 0, and the module’s LED control on pin 15. Referring to the schematic, the installed header carries TX/RCV, fixed to pins 1 and 3, along with ground; 3.3 V and pin 14 are also available and uncommitted. With minimal changes, an ESP-01 module can be used for applications that don’t warrant a Sonoff Basic. The ESP-01 provides pins 0, 1 (Tx), 2, and 3 (Rcv) on its header and does not provide power, a button, an LED, or a relay.
The included application code provides the required SMTP protocol format and the specific message used in this project, including the ‘time.nist.gov’-based time tag. It is part of the project recipe and may be considered part of your toolbox. Basically, the application sends a formatted SMTP message request to the server, and the server forwards it to the requested destination. There are several SMTP server services and other resources to accommodate your custom needs; I chose smtp2go.com to support this application, since its free option provided what I was looking for in an SMTP server.
You can implement this without a server service by running your own server on either a Sonoff or an ESP-01 module and dealing with your Internet service provider’s destination name, input server address, and port information yourself. You may also email directly to your cell phone as an SMS message. For AT&T users, you can send simple messages to your 10-digit phone number (written as one long word with no spaces or hyphens) followed by @txt.att.net. For Verizon, email your phone number followed by @vtext.com. For Sprint, email the number followed by @messaging.sprintpcs.com. And for T-Mobile, the number is followed by @tmomail.net. Other phone services may have similarly formatted account references.
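The email-to-SMS route above can be scripted with any SMTP library. Here is a minimal Python sketch (not the project's .ino code); the relay host, port, and sender address are placeholders you would replace with your own SMTP service's details:

```python
import smtplib
from email.message import EmailMessage

# Placeholder relay settings: substitute your SMTP service (e.g. an smtp2go account)
SMTP_HOST, SMTP_PORT = "mail.example-relay.com", 2525

# Carrier email-to-SMS gateways, as listed above
CARRIER_GATEWAYS = {
    "att": "txt.att.net",
    "verizon": "vtext.com",
    "sprint": "messaging.sprintpcs.com",
    "tmobile": "tmomail.net",
}

def build_sms_email(phone, carrier, body, sender="sensor@example.com"):
    """Format a motion-event alert as an email addressed to a carrier SMS gateway."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = f"{phone}@{CARRIER_GATEWAYS[carrier]}"
    msg["Subject"] = "Motion event"
    msg.set_content(body)
    return msg

def send(msg, user, password):
    """Relay the message through the SMTP server (placeholder credentials)."""
    with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as s:
        s.starttls()
        s.login(user, password)
        s.send_message(msg)

msg = build_sms_email("5551234567", "verizon", "Motion detected 18:42 GMT")
print(msg["To"])  # 5551234567@vtext.com
```

The same message builder works for the IFTTT route: point the destination at the IFTTT mail trigger address and put your chosen hashtag in the subject line.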
Including an email service in your application also allows the Alexa-based apps to connect through IFTTT services, using the ‘mail’ trigger, to any of hundreds of IoT-compatible devices, including custom ones.
Alexa can be used as an IFTTT trigger, but does not allow its connected smart devices, with some exceptions, to be directly used in the trigger criteria. Additional code using a webhook or the Adafruit IO service may be an alternate choice for emailing. Voice-controlled, custom project data can then be sent through an IFTTT email or webhook trigger, using a hashtag/passed value in the subject line, either to a service or to the io.adafruit MQTT broker, directly or indirectly through the IFTTT service. The IFTTT connection, however you trigger it, may then connect to compatible appliances or another custom application.
In this way, the voice- and phone-based motion detector described here may be used with an IFTTT service action that isn’t available from Alexa itself, an Alexa-based routine, or an available Alexa-based app. Used as a security-related detector, the IFTTT can also call 9-1-1 or another phone number, send an email, or turn on a light or camera using a very simple online applet (recipe). There may be several other ways to do this.
The motion sensor I used in this project to trigger an event was an HC-SR501 passive infrared (PIR) motion detector. It can also operate standalone, with both sensitivity and timer adjustments. My application uses the sensitivity adjustment only, while the timer adjustment is set to minimum.
The motion detector was mounted on the end of the LED fixture and installed along the edge of my closet door. Two light fixtures were configured with the power provided inside and along the closet framing. In the Sonoff assembly, this sensor uses the +5V provided from the Sonoff (junction U2 and D5) by including the ‘hay’ wiring, as described in the photo. D0 is also wired and coded to allow a local Sonoff button to initiate a triggered event. The sensor power, in the ESP-01 assembly version, is powered directly from the 9 to 12 volt supply.
Web-based time is a convenience, and if WiFi is integral to your application, as it is in this project, you may copy and paste code from an assortment of library resources, including this project’s, to substitute for a real-time clock. The time will be provided in GMT. It is included as a portion of the project recipe. While it is not used to create light timing (and any delivered email would include a delivery-related time tag anyway), you may consider it a token function and include it as part of your toolbox.
The voice commands are coded to use a Sinric based ‘lamp’ to configure the project’s mode, light, and message functions, per the table shown. The lamp commands can also be used to set the light delay after triggering, if enabled, from 10 to 99 seconds.
I created a smart device in my Alexa app called “Floor light.” Typical Alexa based configuration commands are: “Alexa set floor light to 4” and “Alexa set floor light to 25.”
For all command codes except 100, the app enable/disable and timing settings are saved in nonvolatile memory. The app was set up in an Alexa routine to disable the action set (“… to 1”) at 8:00 AM and to re-enable it (“… to 2, 3, or 4”) at 4:30 PM. Both the last command/timing-related value and the sensor enable status are visible in the Alexa and Sinric apps.
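The command handling described above amounts to a small numeric dispatcher. Since the project's actual mapping table isn't reproduced here, the specific code-to-mode assignments in this sketch are illustrative assumptions rather than the project's exact .ino logic:

```python
# Hypothetical dispatch for the Sinric "lamp" command codes described above.
# The assignments for codes 1-4 are assumptions for illustration.

state = {"enabled": True, "modes": set(), "light_secs": 30}

def handle_command(code, state):
    """Map a numeric 'set floor light to N' value onto project settings.

    Returns True when the new settings should be saved to nonvolatile
    memory (everything except code 100, per the article).
    """
    persist = code != 100
    if code == 1:
        state["enabled"] = False          # disable the action set
    elif code in (2, 3, 4):
        state["enabled"] = True           # re-enable with an action mode
        state["modes"].add(code)
    elif 10 <= code <= 99:
        state["light_secs"] = code        # light-on time after a trigger, seconds
    return persist

handle_command(45, state)   # "Alexa set floor light to 45": 45-second light time
handle_command(1, state)    # "Alexa set floor light to 1": disable actions
print(state["light_secs"], state["enabled"])  # 45 False
```

In the real firmware this logic would live in the Sinric power-level callback, with the persisted values written to the ESP8266's flash.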
The project email is coded as shown below. The email destination field would be substituted with trigger@applet.IFTTT, and the subject line would be changed to #ABCD, where ABCD is something of your own choice entered on the IFTTT site.
Selecting the “Mail” trigger feature on the IFTTT site would then trip the selected IFTTT action. IFTTT is a free service based on a trigger initiating an action. The setup in the IFTTT screen is simple, and there are several options to initiate this mechanism and hundreds of compatible devices and services that can be accessed.
Additive manufacturing (AM) still looms large in our science fiction imaginations in the form of Star Trek-like replication machines that will create just about anything with the push of a button. The truth, however, is that additive manufacturing is still a highly complex process that requires a great deal of preparation and post-printing work—most of it manual. The high costs of such manual labor-intensive processes have restricted metal AM to low-volume, high-price-tag parts, such as those for aircraft or surgical applications. Until automation improves, achieving large-volume production at reasonable costs, like that required for automotive or industrial applications, will be a hard challenge.
Many metal AM companies today are engaged in increased automation of the printers as well as post-production and maintenance tasks to further AM technology. Sweden’s Digital Metal, a proprietary binder-jetting AM technology company created by Höganäs AB, says it can deliver high levels of resolution for small objects combined with high surface quality, as well as unprecedented automation (“no-hand production”) that makes metal AM feasible for high-volume production. Digital Metal is known for producing small, high-volume components using its high-precision DM P2500 system.
“For AM to move into serial or mass production productivity, it needs to improve substantially,” Ralf Carlström, general manager for Digital Metal AB, told Design News. “Today, AM serial production is primarily found in segments with comparatively low volumes, like aerospace and medical implants, where the alternative cost is high. Moving on to high-volume segments like automotive will require productivity to increase substantially through more efficient processes.”
Graphene has shown its versatility for materials science, electronics, and numerous other scientific applications time and again. Now, researchers are using the so-called “wonder material” for a new purpose—to help them create inexpensive, durable, and mass-produced smart textiles.
An international team of scientists led by professor Monica Craciun at the University of Exeter in the United Kingdom has developed a technique that uses graphene to create electronic fibers that can be integrated into everyday clothing production.
Devices Fully Interwoven
The team has coated electronic fibers with lightweight, durable electronic components, allowing for electronic devices to be fully interwoven into the fabric of the material, said Craciun in a news release.
Specifically, researchers used existing polypropylene fibers—already used in a host of commercial applications in the textile industry—to attach the new, graphene-based electronic fibers to create touch-sensor and light-emitting devices, she said.
This technique differs from many current approaches to wearable electronics, which essentially glue devices to fabrics. Those methods are flawed in that they produce rigid textiles that are prone to malfunction and are not very comfortable for the wearer, she said.
The new technique means that the fabrics can incorporate wearable displays in clothing without the need for electrodes, wires, or additional materials, Craciun noted. “For truly wearable electronic devices to be achieved, it is vital that the components are able to be incorporated within the material, and not simply added to it.”
Craciun’s team included researchers not only from the University of Exeter, but also from the universities of Aveiro and Lisbon in Portugal and from CenTexBel in Belgium. Researchers published a paper on their work in the Nature partner journal npj Flexible Electronics.
While wearable electronics and clothing with integrated electronic devices already exist, the researchers believe their invention can significantly expand the development of wearable electronic devices for everyday applications. The fibers also could be used to diversify devices already used in health monitoring, such as heart-rate, blood-pressure, and other diagnostic wearables.
Key to the design is graphene, which is the thinnest known substance for conducting electricity. Combining this with its inherent strength and flexibility makes it ideal for developing smart textiles, said Elias Torres Alonso, a research scientist at Graphenea and former PhD student on Craciun’s team at Exeter.
“This new research opens up the gateway for smart textiles to play a pivotal role in so many fields in the not-too-distant future,” he said. “By weaving the graphene fibers into the fabric, we have created a new technique to allow the full integration of electronics into textiles. The only limits from now are really within our own imagination.”
Elizabeth Montalbano is a freelance writer who has written about technology and culture for 20 years. She has lived and worked as a professional journalist in Phoenix, San Francisco, and New York City. In her free time, she enjoys surfing, traveling, music, yoga, and cooking. She currently resides in a village on the southwest coast of Portugal.
An ongoing megatrend in automation and control is the development of Industrial Ethernet technologies targeting the needs of process control applications. The integration of EtherNet/IP with the HART protocol, along with emerging FDI standards, is part of a strategic vision for manufacturers looking to maintain cost-effective, sustainable production capacity in the process industries. The promise is that this approach will simplify the exchange of configuration, diagnostic, and production data between field devices and higher-level systems used for supervisory control, data acquisition, and plant asset management.
Integration of HART Devices
A clear example of these developments is ODVA’s recent announcement of enhancements to the EtherNet/IP specification, which outlines how to integrate HART devices into EtherNet/IP system control architectures. This capability is viewed as an important mechanism for process control users to connect with their existing infrastructure while leveraging the benefits of Industrial Ethernet.
“The integration of conventional HART I/Os is another step in fulfilling ODVA’s vision for the Optimization of Process Integration,” stated Olivier Wolff, chair of the ODVA technical working group for EtherNet/IP in the Process Industries, in a recent press release. “Now that the initial focus to integrate conventional field devices with industrial control systems and asset management systems is complete, the organization will continue to adapt EtherNet/IP to the full spectrum of process industries’ needs, including profiles for field devices to simplify device integration, diagnostics according to NE107, and comprehensive device configuration methods.”
At the recent ODVA technical conference, the working group reported on its efforts to integrate HART with the Common Industrial Protocol (CIP). With these new enhancements to the specifications, a CIP device can communicate with a HART device as if it were a native CIP device, without requiring changes to the HART device or to the CIP systems used for industrial control.
Honeywell Process Solutions
At the conference, Brian Reynolds, senior director of engineering for Honeywell Process Solutions, presented the company’s rationale for embracing Industrial Ethernet solutions moving forward. Reynolds said that EtherNet/IP is an important platform for industrial control and, in the future, for overall digitization. The company is already using EtherNet/IP in its Connected Plant solutions to collect meaningful data from devices and to improve overall equipment effectiveness and safety. He said that, by joining ODVA as a principal member, Honeywell is “increasing its contribution to the advancement of EtherNet/IP and related ODVA technology and standards, in order to increase productivity, reliability, safety, security, and digitization in the process and hybrid industries.”
EtherNet/IP + FDI Standards
Also shaping Industrial Ethernet technologies for process applications is the combination of EtherNet/IP protocols, the FDI Device Integration standard, and OPC UA. According to Smitha Rao, co-founder of Utthunga Technologies, the huge installed base of EtherNet/IP devices in both hybrid and discrete industries gives it the potential to become a leading industrial communication protocol for process applications.
Rao added that FDI requirements driven by NAMUR and FDI’s flexible architecture provide an ability to add protocols without changing the host implementation, and will facilitate adding more protocols to the FDI standard in the coming years.
The NAMUR position paper, “An Ethernet communication system for the process industry,” calls for the EtherNet/IP (IEC 61784-2 CPF2/2) and PROFINET IO CC B protocols (IEC 61784-2 CPF3/5) to become minimum binding requirements for the process industry. It also recommends that FDI device packages required for Field Device Integration (FDI) be available in the devices and capable of being transmitted to central tools.
FieldComm Group has collaborated with the OPC Foundation to create an FDI Information Model, which provides harmonized, protocol-agnostic data to the enterprise layer. EtherNet/IP devices supporting FDI technology can be made available via the FDI Information Model to manufacturing execution systems (MES) as well as to enterprise applications, such as Enterprise Resource Planning (ERP) and Supply Chain Management (SCM).
Al Presher is a veteran contributing writer for Design News, covering automation and control, motion control, power transmission, robotics, and fluid power.
Developers may remember a time when you’d boot up your computer and all you’d get was a blank screen and blinking cursor. It was up to engineers and coders to build the content; the computer was just a platform. Ian Bernstein, founder and head of product at Misty Robotics, believes robots today are in that same place that computers were decades ago. “We’re at that same point with robots today, where people are just building robots over and over with Raspberry Pis and Arduinos,” Bernstein told Design News.
Bernstein is calling for a departure from thinking of robots as tools and machines to thinking of them more as platforms. Misty Robotics has designed its flagship robot of the same name, Misty, with that idea in mind. “It’s about giving people enough functionality to start to do useful things—but not too much, where it becomes too expensive or complicated,” Bernstein said. “It’s also about complexity. For developers, it is not approachable if you don’t know where to start.”
Boulder, Colorado-based Misty Robotics’ upcoming product, Misty II, is a 2-ft-tall, 6-lb robot. It is designed to do what the smartphone has done for mobile app developers, but for robotics engineers and makers—provide access to powerful features to open up the robot for a variety of applications. At its core, Misty II is driven by a deep learning processor capable of a variety of machine learning tasks, such as facial and object recognition, distance detection, spatial mapping, and sound and touch sensing. Developers can also 3D print (or even laser cut or CNC machine) custom parts to attach to Misty to expand its functionality for moving and manipulating objects. Misty II will also feature USB and serial connectors as well as an optional Arduino attachment to allow for hardware expansion with additional sensors and other peripherals. (One planned for release by the company is a thermal imaging camera.)
There are already several single-purpose robots available to consumers to use in the home. People will be most familiar with the Roomba robotic vacuum, but there are also robotic window washers, lawnmowers, security guards, and even pool cleaners currently available.
Speaking with Design News ahead of CES 2019, where Misty II was available for hands-on demonstrations, Bernstein said that, while the idea of a smart home full of connected robots all going about their various tasks sounds like the wave of the future, he doesn’t find this vision particularly feasible. “It’s not going to be economical to have single-purpose robots or eight different robots in your home,” he said. “A big part of that is cost. Robots require movements and motors, and you can’t bring that raw materials cost down.”
Rather than moving toward a world of a collaborative robot for every job, we should be heading toward having a singular cobot that can be configured for a plethora of tasks, he said.
The journey toward Misty II begins with Star Wars. But not just because of the inspiration that can be derived from characters like R2-D2 and C-3PO.
In 2014, Bernstein and his team were part of the Disney Accelerator program focused on supporting tech startups. While at Disney, Bernstein and his company (then called Orbotix) were working on a robot in a very simple form factor—a ball.
It was around this time that production was gearing up for Star Wars: The Force Awakens, the series entry that would introduce a new fan-favorite robot character, BB-8. While BB-8 was brought to life on-screen using puppeteering and other special effects, the team at Disney also wanted to create a real-life working model of BB-8. Orbotix’s work caught the attention of Disney CEO Bob Iger, who instantly recognized the team’s work as the solution to bringing BB-8 into the real world.
The time at Disney would allow Bernstein and his team to develop and release their first commercial product. They changed their company name to Sphero and released a spherical robot of the same name. Since its release, Sphero has found success as a consumer product and can be found in many stores. It has also found a home as an education product and has spawned a vibrant community of schools and educators that use it to teach STEM. Today, Sphero is used in over 10,000 schools worldwide, according to Bernstein.
There is also, of course, a toy model of BB-8 that is essentially a Sphero with a BB-8 skin on top.
There Is No Killer App
But you can’t spend any amount of time with the team making the next Star Wars without picking up some new ideas. “Disney got us thinking about adding personality and story elements to our robots with Sphero,” Bernstein said. “We started thinking about what could really be done with robots. We had prototyped some more advanced robots at Sphero—telepresence robots and things like that—but they didn’t feel quite right.”
What the Sphero team was searching for was how to create a product to which users would feel a deep, personal connection. R2-D2 and BB-8 are entertainment, but why couldn’t they make the real version? “It was thinking about this idea of a robot in every home and office, and why couldn’t a robot do useful things and have a personality and character?” Bernstein said.