Long-lived Things for the Internet of Things

Credit: Tom Cogill/University of Virginia School of Engineering and Applied Science

Even the most frugal among us upgrade our cell phones and laptops every now and again. They’re relatively easy to replace.

That’s not true of the internet-connected sensors increasingly installed in buildings and other structures. Small monitoring devices in hard-to-reach places continuously collect and communicate data – everything from temperature and humidity levels in office buildings to the pressure on bridges.

Sensors make up a sizable slice of the broader network researchers call the “internet of things,” comprising the billions of connected products that make our cars, homes, businesses and cities more efficient. These networks promise enormous societal benefits, such as medical sensors that help doctors give patients better care and environmental sensors that help people save energy.

But one big question holds the technology back from reaching its potential: How do you keep all those tiny computers working through the march of time?

“Internet of things devices are often monitoring infrastructure we expect to last for decades, or centuries. We don’t replace all of our buildings every five years,” said Brad Campbell, an assistant professor of computer science and electrical and computer engineering at the University of Virginia School of Engineering and Applied Science.

He has earned a five-year, $700,000 National Science Foundation CAREER Award for his research, “Repurposable Devices for a Greener Internet of Things,” to answer the question.

The CAREER program, one of the NSF’s most prestigious awards for early-career faculty, recognizes the recipient’s potential for leadership in research and education. Campbell is a member of UVA Engineering’s Link Lab, an interdisciplinary cyber-physical systems research center where he is one of several faculty on the leading edge of research focused on how humans can use connected systems to interact with their surroundings.

Consider a smart-lighting system that turns lights on and off in a building depending on where people are moving around inside. A single building may use up to 30,000 sensors. Fully exploiting the benefits of connected devices means their numbers could quickly run into the trillions. There’s no practical way to upgrade or replace all of them as frequently as we do our personal devices.

Imagine just trying to keep fresh batteries in all those sensors – one of the biggest maintenance headaches of all, and one Campbell knows well from his Ph.D. research in computer science at the University of Michigan.

“Not to mention what do we do with all those old devices that were working reasonably well, but they maybe just didn’t keep up with the times? Now those basically become e-waste,” Campbell said, adding that, unlike larger computers and appliances, small electronic devices are difficult to recycle.

“What we need is something that can let us use already deployed devices and keep them useful decades into the future,” Campbell said. “But it’s challenging because the arc of technology doesn’t go backward. We don’t tend to want things with fewer features, or that are less secure or that provide less utility. We always want more, more, more.”

Campbell and his team, including Ph.D. students Nurani Saoda and Nabeel Nasir, are designing a new class of sensors and what they call the “ecosystem” in which the sensors will operate. By “ecosystem” Campbell means software the team is developing to run on existing commercial hardware platforms, such as the Raspberry Pi. The goal is that the sensors would be capable of adapting to whatever the future brings.

The first hurdle is powering the sensors without wires or batteries and without knowing what their surroundings will be in 20 years.

The researchers already know how to design sensors to draw enough energy from nearby sources, such as the sun, indoor lighting or vibrations, to operate. But those sources might not always be there as building occupants and uses change.

“Part of this project is a new design for the energy-harvesting power supply that can encapsulate that complexity, so it can manage what happens when the energy characteristics and needs change,” said Saoda, who is leading the power supply work. “That frees up the application-level processor to focus on the sensing task, which simplifies development and creates more adaptive devices.”
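To make that division of labor concrete, here is a minimal sketch, in Python, of how such an encapsulating power supply might present itself to the application processor. Every name, threshold and interface below is invented for illustration; the team’s actual design is embedded hardware and firmware, and its interface is not described in this article.

```python
from enum import Enum, auto

class EnergyBudget(Enum):
    """Coarse energy states the supply reports upward (invented names)."""
    DORMANT = auto()    # not enough stored energy to do anything
    SENSE = auto()      # enough to take a reading
    TRANSMIT = auto()   # enough to take a reading and radio it out

class HarvestingSupply:
    """Hypothetical power-supply module: it alone tracks which ambient
    sources (solar, indoor light, vibration) are currently available."""

    def __init__(self, sources):
        self.sources = sources    # e.g. ["solar", "vibration"]
        self.stored_uj = 0.0      # stored energy in microjoules

    def harvest(self, ambient):
        # Accumulate whatever the environment happens to offer right now;
        # the application never sees this bookkeeping.
        self.stored_uj += sum(ambient.get(s, 0.0) for s in self.sources)

    def budget(self):
        # Translate raw storage into a simple contract (thresholds invented).
        if self.stored_uj > 500.0:
            return EnergyBudget.TRANSMIT
        if self.stored_uj > 50.0:
            return EnergyBudget.SENSE
        return EnergyBudget.DORMANT

def sensing_task(supply, read_sensor, send):
    """The application asks only "what can I afford?", never "which source?"."""
    budget = supply.budget()
    if budget is EnergyBudget.DORMANT:
        return None
    value = read_sensor()
    if budget is EnergyBudget.TRANSMIT:
        send(value)
    return value

# If the building later loses its solar exposure, only the supply's source
# list changes; the sensing task above is untouched.
supply = HarvestingSupply(["indoor_light", "vibration"])
supply.harvest({"indoor_light": 400.0, "vibration": 200.0})
print(sensing_task(supply, read_sensor=lambda: 21.5, send=print))
```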

A second challenge is adding software to old networked devices to perform new tasks without crashing existing applications.

“The status quo today is you just update everything,” Campbell said. “You replace all of the code that’s running on your devices, and if something goes wrong, maybe there’s a way to revert back to the old version, or if there’s a bug, hopefully you can update it again.”

The older the hardware gets, the riskier updates become – until operators are forced to forgo them entirely, ignoring new security issues and leaving the device stuck with its current functionality.

To solve the problem, Campbell’s team is working on new software architectures to make the devices’ software configuration modular, essentially isolating software components from one another. The components can then be upgraded individually, without reprogramming the entire device.
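As a rough illustration of component isolation (not the team’s actual architecture, which the article does not detail), a tiny runtime can hold each module behind a stable dispatch interface, so that one module is replaced or rolled back without touching the rest:

```python
class Component:
    """A versioned, independently replaceable software module."""
    def __init__(self, name, version, handler):
        self.name, self.version, self.handler = name, version, handler

class DeviceRuntime:
    """Toy runtime that isolates components behind a dispatch interface,
    so one module can be swapped without reflashing the whole device."""

    def __init__(self):
        self._components = {}

    def install(self, component):
        # Replace only this component; every other module keeps running.
        previous = self._components.get(component.name)
        self._components[component.name] = component
        return previous

    def rollback(self, previous):
        # If the new module misbehaves, restore just that one piece.
        if previous is not None:
            self._components[previous.name] = previous

    def call(self, name, message):
        return self._components[name].handler(message)

# Upgrade the crypto module alone; the sensing module is never reflashed.
runtime = DeviceRuntime()
runtime.install(Component("sensing", "1.0", lambda msg: {"temp_c": 21.5}))
runtime.install(Component("crypto", "1.0", lambda msg: b"old-cipher"))
old = runtime.install(Component("crypto", "2.0", lambda msg: b"new-cipher"))
print(runtime.call("crypto", b"hello"))   # b'new-cipher'
```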

This software modularity is one of the methods developed from the project that Campbell will incorporate into graduate and undergraduate courses for the educational piece of the CAREER Award. He is also looking for graduate students from backgrounds historically underrepresented in engineering, as well as first-year undergraduates, to collaborate on the project.

The goal is training engineers skilled in fundamental techniques who understand the intersection of the internet of things and its cross-disciplinary applications, as well as the ethical implications for all stakeholders.

The team will address one more limitation on the functional lifespan of small connected devices: “computational obsolescence.” As hardware improvements make it possible to get more performance and run increasingly complex software using the same amount of energy, older devices’ computing power diminishes relative to the new demands.

Campbell looked to the way wireless communication technology prioritizes backward compatibility. For example, a modern Bluetooth 5 device can still pair with a Bluetooth device from the mid-2000s.

“This suggests that while today’s microcontrollers may not be sufficient for tomorrow’s software, today’s devices will be able to communicate for decades to come,” Campbell wrote in his CAREER Award proposal.

The insight led to the idea of offloading tasks too complex for an aging sensor to a nearby gateway hub capable of doing the job. Because there will be far fewer gateways – roughly five for every 300 sensors – they’re feasible to upgrade or replace as technology evolves.

It’s kind of like using a 2007 iPhone in 2027 – with all the speed and functionality of the latest model.
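A minimal sketch of that offloading pattern is below. The cost model, task interface and threshold are invented for illustration; the project’s real gateway software is separate work, described next.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class Task:
    name: str
    payload: Any
    cost: int   # abstract compute cost; an invention for illustration

class Gateway:
    """Stands in for the per-building hub (roughly 5 per 300 sensors)."""
    def __init__(self, handlers: Dict[str, Callable[[Any], Any]]):
        self.handlers = handlers

    def execute(self, task: Task) -> Any:
        return self.handlers[task.name](task.payload)

SENSOR_MAX_COST = 10   # what the aging on-device microcontroller can afford

def run(task: Task, local: Dict[str, Callable[[Any], Any]], gw: Gateway) -> Any:
    """Run cheap, known tasks locally; silently offload everything else."""
    if task.cost <= SENSOR_MAX_COST and task.name in local:
        return local[task.name](task.payload)
    return gw.execute(task)   # the application developer never sees this hop

# A 20-year-old sensor can average readings itself but offloads inference.
gw = Gateway({"classify": lambda x: "occupied" if sum(x) > 5 else "empty"})
local = {"average": lambda x: sum(x) / len(x)}
print(run(Task("average", [20.1, 20.3], cost=1), local, gw))   # on-device
print(run(Task("classify", [2, 2, 3], cost=50), local, gw))    # offloaded
```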

Nasir is developing the gateway software to support the sensors.

“In effect, it’s as if we have the newest hardware deployed for the devices themselves, but without having to actually replace a huge number of sensors,” Nasir said.

Importantly to Campbell, future application developers who want to add new functions to the sensors should never have to think about, or even be aware of, the transaction between the sensors and the gateway. He wants to remove barriers preventing decision-makers across all sectors of society from maximizing the benefits of the internet of things.

“Having these devices not follow the path of smartphones where we replace them all the time, I think is important,” Campbell said. “But enabling better-operated equipment, better-managed buildings, more efficient infrastructure is also important.

“We need to find ways to make not just the individual technology pieces or the components or the devices themselves, but the larger ecosystem, actually scalable,” he said. “That’s what led to this idea of repurposable devices.”

Ovarian Cancer in the Fatty Omentum: Metabolic Enzyme’s Key Role in Tumor Metastasis

(LOS ANGELES) – July 1, 2022 – In their recent publication in Cell Reports, a team of scientists, led by Xiling Shen, Ph.D., Chief Scientific Officer at the Terasaki Institute for Biomedical Innovation (TIBI), has demonstrated the pivotal role of an enzyme, glucose-6-phosphate dehydrogenase (G6PD), in facilitating ovarian cancer (OC) growth and metastasis in the omentum, a curtain of fatty tissue found in the abdominal cavity.

OC is a particularly deadly metastatic disease: 80% of patients are diagnosed at stage III or higher, and the five-year survival rate is approximately 30%. OC often shows a particular preference for migrating to and aggressively proliferating in the omentum, which provides fatty acids as a fuel source for OC cells.

As a part of this increase in fatty acid metabolism by the OC cells, certain oxidative compounds are produced, which impose a degree of oxidative stress in the omental microenvironment. In response, a metabolic pathway called the pentose phosphate pathway (PPP) is activated, which not only counteracts this stress but is also an essential part of certain metabolic processes in cancer cells.

Although it is known that G6PD is the rate-controlling enzyme in the PPP, its effects on OC metastasis in the omentum had not been previously examined. Dr. Shen’s team has shed light on this question by conducting a series of revealing experiments.

Genetic and metabolic analyses revealed elevated levels of PPP oxidative compounds and metabolites, as well as of G6PD, in the omental metastases compared to primary tumors in OC patients. Similar observations were made in mice injected with different OC cell lines and in OC cells or organoids cultured in media conditioned with omental tissue.

These initial experiments confirmed the omental (OM) OC cells’ PPP response to oxidative stress generated by omental fatty acid metabolism. The elevated levels of G6PD observed in these samples provided a link. Ensuing inhibition experiments definitively demonstrated G6PD’s influence on OM OC cells. Genetic silencing or pharmacological inhibition of G6PD induced significant cell death and increased levels of key oxidative compounds in cells grown in omental conditioned media compared with cells grown in other media. The results from these experiments illustrated that G6PD must be present to activate the PPP and counteract the omental production of oxidative compounds.

This observation was further confirmed by in vivo studies in mice injected with genetically altered, G6PD-inhibited OM OC cells or treated with a G6PD-inhibiting drug, both of which resulted in much smaller metastatic tumors in the omentum.

Taken together, the results signify that G6PD is an essential component used to offset the oxidative stresses created by fatty acid metabolism in OC cells in the omentum. Without this enzyme, the PPP cannot function, and the metastatic cells succumb to the resultant buildup of oxidative compounds.

“Elucidation of the metabolic interplay which influences tumor survival and metastasis increases the potential for targeted therapeutic development,” said Ali Khademhosseini, Ph.D., TIBI’s Director and CEO. “This work is a step in that direction and has significant clinical relevance for aggressively metastatic disease like ovarian cancer.”

PRESS CONTACT

Stewart Han, [email protected], +1 818-836-4393

Terasaki Institute for Biomedical Innovation

###

The Terasaki Institute for Biomedical Innovation (terasaki.org) is a non-profit research organization that invents and fosters practical solutions that restore or enhance the health of individuals. Research at the Terasaki Institute leverages scientific advancements that enable an understanding of what makes each person unique, from the macroscale of human tissues down to the microscale of genes, to create technological solutions for some of the most pressing medical problems of our time. We use innovative technology platforms to study human disease on the level of individual patients by incorporating advanced computational and tissue-engineering methods. Findings yielded by these studies are translated by our research teams into tailored diagnostic and therapeutic approaches encompassing personalized materials, cells and implants with unique potential and broad applicability to a variety of diseases, disorders, and injuries.

The Institute is made possible through an endowment from the late Dr. Paul I. Terasaki, a pioneer in the field of organ transplant technology.

Authors are Shree Bose, Qiang Huang, Yunhan Ma, Lihua Wang, Grecia O. Rivera, Yunxin Ouyang, Regina Whitaker, Rebecca A. Gibson, Christopher D. Kontos, Andrew Berchuck, Rebecca A. Previs, and Xiling Shen.

This work was supported by National Cancer Institute grants NIH-U01CA217514 and U01CA214300, as well as National Institutes of Health F30 fellowship 1F30CA257365-01.

New Screening Technique Could Accelerate and Improve mRNA Therapies

Therapeutics based on messenger RNA, or mRNA, can potentially treat a wide range of maladies, including cancer, genetic diseases, and as the world has learned in recent years, deadly viruses.

To work, these drugs must be delivered directly to target cells in nanoscale bubbles of fat called lipid nanoparticles, or LNPs — mRNA isn’t much good if it doesn’t reach the right cell type.

A team of researchers at the Georgia Institute of Technology and Emory University’s School of Medicine has taken another step toward improving development of these custom-made delivery vehicles, reporting their work June 30 in Nature Nanotechnology. Curtis Dobrowolski and Kalina Paunovska, trainees in the lab of James Dahlman, have developed a system to make pre-clinical nanoparticle studies more predictive. Their discoveries already are influencing the direction of research in this growing, competitive field.

“I’m very excited about this study and anticipate shifting most of our future projects to this methodology,” said Dahlman, associate professor and McCamish Foundation Early Career Professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory. 

Sequencing of Events

For the past few years, Dahlman has partnered with Coulter BME Professor Philip Santangelo in a busy research enterprise. Santangelo’s lab develops mRNA therapies, and Dahlman’s lab delivers them using LNPs.

To speed up the process of testing the effectiveness of their LNPs, Dahlman’s team has developed a technique called DNA barcoding. In this process, researchers load each LNP with a snippet of DNA that uniquely identifies it. The LNPs are then injected, and cells are subsequently examined for the presence of the “barcodes” using genetic sequencing. The system identifies which barcodes have reached which specific targets, highlighting the most promising nanoparticles. Since many DNA sequences can be read at once, the barcoding process allows many experiments to be performed simultaneously, thereby accelerating the discovery of effective lipid nanoparticle carriers.
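The downstream analysis can be pictured as a counting problem: tally how often each barcode, and therefore each LNP, shows up in reads recovered from each target. The barcodes, the map and the toy reads below are invented; real pipelines work from raw sequencing files and designed barcode libraries.

```python
from collections import Counter

# Illustrative barcode-to-nanoparticle map. Real studies design one unique
# DNA sequence per LNP formulation in the pooled injection.
BARCODE_TO_LNP = {
    "ACGTACGT": "LNP-01",
    "TTGCAATG": "LNP-02",
    "GGCATCCA": "LNP-03",
}

def rank_lnps(reads_by_tissue):
    """Count the barcode reads recovered from each tissue and rank the
    formulations by how often their barcode reached that target."""
    ranking = {}
    for tissue, reads in reads_by_tissue.items():
        counts = Counter(BARCODE_TO_LNP[r] for r in reads if r in BARCODE_TO_LNP)
        ranking[tissue] = counts.most_common()
    return ranking

# Toy sequencing output: LNP-02's barcode dominates in liver cells.
reads = {
    "liver": ["TTGCAATG", "TTGCAATG", "ACGTACGT"],
    "spleen": ["GGCATCCA"],
}
print(rank_lnps(reads))
# {'liver': [('LNP-02', 2), ('LNP-01', 1)], 'spleen': [('LNP-03', 1)]}
```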

DNA barcoding has significantly improved the nanoparticle pre-clinical screening process. But there is still a significant barrier impacting drug delivery. Because of their diversity, cells are kind of like moving targets. Dahlman noted that cells previously thought to be homogeneous are composed of distinct and varied cell subsets. His team surmised that this chemical and genetic heterogeneity has a powerful influence on how well LNPs can deliver mRNA therapies into the cells.

“Cells don’t have just one protein that defines them — they’re complicated,” Dahlman said. “They can be defined by a combination of things, and if we’re being honest, they are best defined by all the genes they do, or do not, express.”

To test their hypothesis, the researchers developed a new tool to measure all of these things at once. Their multiomic nanoparticle delivery system is called single-cell nanoparticle targeting-sequencing, or SENT-seq. 

Multiomics Approach

Using SENT-seq, the researchers were able to quantify how LNPs deliver DNA barcodes and mRNA into cells, the subsequent protein production facilitated by the mRNA drug, as well as the identity of the cell, in thousands of individual cells. 

This multiomics approach could represent an important leap forward for high-throughput LNP discovery. The SENT-seq technique allowed the team to identify cell subtypes that demonstrate particularly high or low nanoparticle uptake, and the genes associated with those subtypes. 

So, in addition to testing the efficacy of a drug and how certain cell subtypes react to nanoparticles, they’re identifying which genes are involved in the successful uptake of LNPs. And they’re doing it all at once.
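One way to picture the kind of per-cell table SENT-seq yields, and the grouping step that flags high- and low-uptake subtypes, is sketched below. The field names, subtypes and numbers are invented for illustration and are not data from the study.

```python
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

@dataclass
class CellReadout:
    """One cell's SENT-seq-style record (field names are illustrative)."""
    cell_type: str          # identity from transcriptome clustering
    barcode_counts: int     # LNP barcodes recovered (nanoparticle uptake)
    mrna_counts: int        # delivered mRNA detected
    protein_signal: float   # protein produced from the delivered mRNA

def uptake_by_subtype(cells):
    """Group cells by subtype and average nanoparticle uptake, to flag
    subtypes with unusually high or low delivery."""
    groups = defaultdict(list)
    for c in cells:
        groups[c.cell_type].append(c.barcode_counts)
    return {t: mean(v) for t, v in groups.items()}

cells = [CellReadout("endothelial-A", 120, 40, 3.2),
         CellReadout("endothelial-B", 4, 1, 0.1),
         CellReadout("endothelial-A", 95, 33, 2.8)]
print(uptake_by_subtype(cells))   # subtype A shows far higher uptake
```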

“The data suggests that these different cell subsets have distinct responses to nanoparticles that influence how well an mRNA therapy works,” Dahlman said. “There’s still a lot of work to be done, but we think the ability to simultaneously read out high-throughput nanoparticle delivery and the cellular response to nanoparticles will lead to better mRNA therapies.”

Co-lead author Paunovska said that she and Dobrowolski came up with the idea for the SENT-seq system, “organically, after two months of working together.”

Dahlman added: “I’m proud of the work that Curtis, Kalina, and the team did in the lab. I think this is the beginning of an extremely interesting phase in our work.”

This research was supported by the National Institutes of Health, grant Nos. UG3-TR002855 and R01DE026941. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of any funding agency.

Novel Sewage Treatment System Removes up to 70% of Nitrogen That Would Otherwise Be Discarded Into Nature

A new type of biofilm reactor adapted to Brazilian conditions and using polyurethane foam to lower costs can reduce the amount of nitrogen compounds in wastewater by as much as 70%, according to an article in Environmental Technology. The researchers who conducted the study developed a mathematical model to analyze and predict the nitrogen removal mechanism. The biofilm comprised bacteria that converted nitrogen compounds into nitrogen gas, which is environmentally harmless.

The study was led by Bruno Garcia Silva during his doctoral research in hydraulic engineering and sanitation at the University of São Paulo (USP) in Brazil, with Eugenio Foresti as thesis advisor. Foresti is a professor at the São Carlos School of Engineering (EESC-USP). The study was supported by FAPESP.

The article was one of the results of the Thematic Project “Biorefinery concept applied to biological wastewater treatment plants: environmental pollution control coupled with material and energy recovery”, for which Marcelo Zaiat, also a professor at EESC-USP, was principal investigator. Researchers at the Federal University of São Carlos (UFSCar) and Mauá Institute of Technology (IMT) collaborated. 

“Nitrogen removal is still achieved by only a few wastewater treatment plants in Brazil, whereas it’s regularly performed in Europe and the United States,” Garcia told Agência FAPESP. “The idea is to adapt [the necessary infrastructure] to our reality. The usual method here is based on anaerobic reactors, which produce effluent with low levels of organic matter, making nitrogen removal difficult.”

Removal of nitrogen compounds (nitrite, nitrate and ammonia, among others) from both domestic sewage and industrial wastewater is essential because they contaminate surface water (lakes, reservoirs and streams) as well as aquifers and other groundwater, allowing the growth of bacteria, algae and plants to spiral out of control in a process known as eutrophication.

Furthermore, consumption of water contaminated by nitrate can lead to diseases such as infant methemoglobinemia (blue baby syndrome), which causes headache, dizziness, fatigue, lethargy, breathlessness, and neurological alterations such as seizures and coma in severe cases. 

“When algal blooms proliferate, as seen in reservoirs like Billings [one of the main water sources for São Paulo], for example, lack of oxygen in the water leads to the death of fish and loss of water supply as well as leisure areas. It’s very hard to remove algae from reservoirs,” said Foresti, who leads the group.

Differentiators

One of the key differentiators of this new reactor model is the biofilm formed by a biological process in which bacteria create a film on the polyurethane foam. Another is the configuration of the equipment to permit what the researchers call counterdiffusion, where oxygen is introduced on the opposite side to the contaminants.

“Oxygen is transported into the foam because this ensures that it remains only where it’s needed for the reaction to occur,” Garcia explained. “We don’t want oxygen to come into contact with organic matter all the time. If it did, the bacteria would use up all the oxygen to break it down and nothing would be left over to consume the nitrite and nitrate. So we insert the oxygen on the other side of the biofilm. The goal is for the organic matter that reaches the biofilm on the opposite side to be oxidized not just by oxygen but also by nitrite and nitrate.” 

When oxygen does not enter the reactor, the ammonia remains unchanged. When ammonia enters the site of the reactor with oxygen input, however, it is converted into nitrite and nitrate. “The only way out is via the biofilm, and the compounds cross this barrier by diffusion in the opposite direction to the organic matter. Their collision with organic matter in contraflow creates optimal conditions for nitrite and nitrate removal because there’s no longer any oxygen and there’s enough organic matter for denitrification,” Garcia said.
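The article does not reproduce the team’s mathematical model, but the sequential chemistry it describes (ammonia oxidized to nitrite and nitrate where oxygen is supplied, then denitrified to nitrogen gas where organic matter takes over) can be caricatured with generic first-order kinetics. The rate constants and time span below are arbitrary, chosen only so the toy run lands near the reported ~70% removal:

```python
# A generic first-order caricature of sequential nitrification and
# denitrification. This is NOT the model published in Environmental
# Technology; the rate constants and time span here are arbitrary.

def simulate(nh4, k_nit=0.3, k_den=0.25, dt=0.1, t_end=9.0):
    """Track ammonia (NH4+), lumped oxidized nitrogen (NO2-/NO3-), and
    N2 gas over time; return the fraction of nitrogen removed as N2."""
    total = nh4
    nox = 0.0   # nitrite + nitrate formed where oxygen is supplied
    n2 = 0.0    # harmless nitrogen gas formed where organics dominate
    t = 0.0
    while t < t_end:
        nitrified = k_nit * nh4 * dt     # NH4+ -> NO2-/NO3- (aerobic side)
        denitrified = k_den * nox * dt   # NO2-/NO3- -> N2 (anoxic side)
        nh4 -= nitrified
        nox += nitrified - denitrified
        n2 += denitrified
        t += dt
    return n2 / total

# With these toy constants the run lands near the ~70% reported removal.
print(f"fraction of nitrogen removed as N2: {simulate(50.0):.0%}")
```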

Foresti explained that in Brazil, anaerobic reactors (which break down organic matter using bacteria that do not require oxygen to survive) are increasingly being used by municipal wastewater treatment companies because of the predominant climate, which is warmer than that of the northern hemisphere. Bacteria decompose organic matter faster in warm weather. In Europe and the US, where mean temperatures are lower, the process is different. The organic matter present in the liquid phase after sludge removal is oxidized aerobically (by oxygen). 

In Brazil, however, nitrogen compounds are not completely removed, for cost reasons, and are released directly into nature. The new type of reactor developed by the researchers is designed to add a second, easier and cheaper stage to wastewater treatment, to be developed further through future technologies and partnerships.

Scholarship for research in the US 

Researchers who work at the laboratory of Robert Nerenberg, a professor at the University of Notre Dame in the US, collaborated with Garcia, who was there as a visiting researcher in 2019-20 with FAPESP’s support.  

“The difference between my project and theirs is that instead of polyurethane foam they use a semipermeable membrane, which resembles a drinking straw full of air. When this capillary comes into contact with water, it lets through oxygen but not water, so that the biofilm sticks to the surface and grows on it. In other words, oxygen is supplied to the bacteria through the walls of this thin tube. The oxygen comes out, and the water provides ammonia and organic matter. It’s the same system as counterdiffusion, except that the material we use is simpler and cheaper,” Garcia said.

“The bacteria grow on the surface to form a biofilm, but it’s not a filter properly speaking because it doesn’t offer mechanical resistance to the passage of particles. What the reactor does in fact is serve as a support for the bacteria to grow and consume soluble organic matter and nitrogen compounds.”

Next steps

According to Foresti, the new configuration of the reactor is inspiring further research by the group. In a program of cooperation between the São Paulo State Basic Sanitation Corporation (SABESP) and FAPESP, the researchers plan to test the new model with real sewage that has been through an aerobic reactor in the treatment plant operated by SAAE, the municipal sanitation service in São Carlos. Researchers at UFSCar and IMT are also part of the program and will develop other systems to be tested.

“Bruno’s research is the first to use counterdiffusion in this way here in Brazil,” Foresti said. “It’s proof of concept for synthetic wastewater. The efficiency found in this reactor configuration was greatly superior to that observed in previous research, but we still need to evaluate several factors.” 

The new configuration has been tested in the laboratory. Efficiency will be measured in further projects, as it is not possible to predict how the equipment will behave when processing large volumes of effluent, and the system needs to be tested with actual domestic sewage and industrial wastewater. Hitherto it has been tested only on samples of synthetic waste prepared by the researchers themselves.

“We may have to improve the design and geometry,” Garcia said. “How can the design be optimized to obtain the largest optimal surface area per reactor volume so as to lower the cost? The study provides a basis, a foundation on which we can go on thinking about the process and the mathematical tool.”         

###

About São Paulo Research Foundation (FAPESP)

The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. You can learn more about FAPESP at www.fapesp.br/en and visit FAPESP news agency at www.agencia.fapesp.br/en to keep updated with the latest scientific breakthroughs FAPESP helps achieve through its many programs, awards and research centers. You may also subscribe to FAPESP news agency at http://agencia.fapesp.br/subscribe.
