Still going strong: Pensioner David Latimer from Cranleigh, Surrey, with his bottle garden that was first planted 53 years ago and has not been watered since 1972 - yet continues to thrive in its sealed environment.
David Latimer first planted his bottle garden in 1960 and last watered it in 1972 before tightly sealing it shut 'as an experiment'.
The hardy spiderwort plant inside has grown to fill the 10-gallon container by surviving entirely on recycled air, nutrients and water.
Gardeners' Question Time expert says it is 'a great example of just how pioneering plants can be'.
To look at this flourishing mass of plant life you’d think David Latimer was a green-fingered genius.
Truth be told, however, his bottle garden – now almost in its 53rd year – hasn’t taken up much of his time.
In fact, on the last occasion he watered it Ted Heath was Prime Minister and Richard Nixon was in the White House.
Lush: Just like any other plant, Mr Latimer's bottled specimen has survived and thrived using the cycle of photosynthesis despite being cut off from the outside world.
HOW THE BOTTLE GARDEN GROWS
Bottle gardens work because their sealed space creates an entirely self-sufficient ecosystem in which plants can survive by using photosynthesis to recycle nutrients.
The only external input needed to keep the plant going is light, since this provides it with the energy it needs to create its own food and continue to grow.
Light shining on the leaves of the plant is absorbed by proteins containing chlorophyll (a green pigment).
Some of that light energy is stored in the form of adenosine triphosphate (ATP), a molecule that stores energy. The rest is used to remove electrons from the water being absorbed from the soil through the plant's roots.
These electrons then become 'free' - and are used in chemical reactions that convert carbon dioxide into carbohydrates, releasing oxygen.
This photosynthesis process is the opposite of the cellular respiration that occurs in other organisms, including humans, where energy-rich carbohydrates react with oxygen to produce carbon dioxide and water, releasing chemical energy.
But the ecosystem also uses cellular respiration to break down decaying material shed by the plant. In this part of the process, bacteria inside the soil of the bottle garden absorb the plant's waste oxygen and release carbon dioxide, which the growing plant can reuse.
And, of course, at night, when there is no sunlight to drive photosynthesis, the plant will also use cellular respiration to keep itself alive by breaking down the stored nutrients.
Because the bottle garden is a closed environment, its water cycle is also a self-contained process.
The water in the bottle is taken up by the plant's roots, released into the air through transpiration, and condenses back down into the potting mixture, where the cycle begins again.
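The day-and-night balance described above can be sketched as a toy simulation. All of the numbers below (rate constants, starting amounts, a 12-hour day) are invented for illustration; only the overall chemistry is real: photosynthesis fixes carbon dioxide into carbohydrate and releases oxygen, and respiration runs the reverse reaction.

```python
# Toy model of the sealed bottle garden's daily gas exchange.
# All rate constants are illustrative assumptions, not measured values.
# The underlying chemistry is the balanced photosynthesis equation:
#   6 CO2 + 6 H2O + light -> C6H12O6 + 6 O2  (respiration runs it in reverse)

def simulate_day(co2, o2, sugar, daylight_hours=12):
    """Advance the closed system by one 24-hour cycle."""
    PHOTO_RATE = 2.0  # units of CO2 fixed per daylight hour (assumed)
    RESP_RATE = 0.5   # units of sugar respired per hour (assumed)

    for hour in range(24):
        if hour < daylight_hours and co2 >= PHOTO_RATE:
            # Photosynthesis: CO2 consumed, O2 and sugar produced.
            co2 -= PHOTO_RATE
            o2 += PHOTO_RATE
            sugar += PHOTO_RATE
        # Respiration (plant plus soil bacteria) runs day and night,
        # limited by whatever sugar and oxygen are available.
        used = min(RESP_RATE, sugar, o2)
        sugar -= used
        o2 -= used
        co2 += used
    return co2, o2, sugar

co2, o2, sugar = 100.0, 100.0, 10.0
for _ in range(30):
    co2, o2, sugar = simulate_day(co2, o2, sugar)
```

Because photosynthesis and respiration only shuttle atoms between the carbon dioxide and oxygen pools, their combined total never changes, which is exactly the closed-ecosystem point the bottle garden demonstrates.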
Nobel Prize in Physics for 2013
The Nobel Prize in Physics 2013 was awarded jointly to François Englert (left) and Peter W. Higgs (right) "for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN's Large Hadron Collider"
François Englert and Peter W. Higgs are jointly awarded the Nobel Prize in Physics 2013 for the theory of how particles acquire mass. In 1964, they proposed the theory independently of each other (Englert together with his now deceased colleague Robert Brout). In 2012, their ideas were confirmed by the discovery of a so-called Higgs particle at the CERN laboratory outside Geneva in Switzerland.
The awarded theory is a central part of the Standard Model of particle physics that describes how the world is constructed. According to the Standard Model, everything, from flowers and people to stars and planets, consists of just a few building blocks: matter particles. These particles are governed by forces mediated by force particles that make sure everything works as it should.
The entire Standard Model also rests on the existence of a special kind of particle: the Higgs particle. This particle originates from an invisible field that fills up all space. Even when the universe seems empty this field is there. Without it, we would not exist, because it is from contact with the field that particles acquire mass. The theory proposed by Englert and Higgs describes this process.
On 4 July 2012, at the CERN laboratory for particle physics, the theory was confirmed by the discovery of a Higgs particle. CERN’s particle collider, LHC (Large Hadron Collider), is probably the largest and the most complex machine ever constructed by humans. Two research groups of some 3,000 scientists each, ATLAS and CMS, managed to extract the Higgs particle from billions of particle collisions in the LHC.
Even though it is a great achievement to have found the Higgs particle — the missing piece in the Standard Model puzzle — the Standard Model is not the final piece in the cosmic puzzle. One of the reasons for this is that the Standard Model treats certain particles, neutrinos, as being virtually massless, whereas recent studies show that they actually do have mass. Another reason is that the model only describes visible matter, which only accounts for one fifth of all matter in the cosmos. Finding the mysterious dark matter is one of the objectives as scientists continue the hunt for unknown particles at CERN.
François Baron Englert was born in 1932 and is a Belgian theoretical physicist and 2013 Nobel Prize laureate (shared with Peter Higgs). He is Professor emeritus at the Université libre de Bruxelles (ULB), where he is a member of the Service de Physique Théorique. He is also a Sackler Professor by Special Appointment in the School of Physics and Astronomy at Tel Aviv University and a member of the Institute for Quantum Studies at Chapman University in California. He was awarded the 2010 J.J. Sakurai Prize for Theoretical Particle Physics (with Gerry Guralnik, C.R. Hagen, Tom Kibble, Peter Higgs and Robert Brout), the Wolf Prize in Physics in 2004 (with Brout and Higgs) and the High Energy and Particle Prize of the European Physical Society (with Brout and Higgs) in 1997 for the mechanism which unifies short and long range interactions by generating massive gauge vector bosons. He has made contributions in statistical physics, quantum field theory, cosmology, string theory and supergravity. He is the recipient of the 2013 Prince of Asturias Award in technical and scientific research, together with Peter Higgs and CERN.
Peter W. Higgs CH, FRS, FRSE was born in 1929 and is a British theoretical physicist, Nobel laureate and emeritus professor at the University of Edinburgh. He is best known for his 1960s proposal of broken symmetry in electroweak theory, explaining the origin of mass of elementary particles in general and of the W and Z bosons in particular. This so-called Higgs mechanism, which was proposed by several physicists besides Higgs at about the same time, predicts the existence of a new particle, the Higgs boson (often described as "the most sought-after particle in modern physics"). CERN announced on 4 July 2012 that it had experimentally established the existence of a Higgs-like boson, but further work was needed to analyse its properties and see if it matched those expected of the Standard Model Higgs boson. On 14 March 2013, the newly discovered particle was tentatively confirmed to have positive parity and zero spin, two fundamental criteria of a Higgs boson, making it the first known fundamental scalar particle to be discovered in nature (although composite scalars, such as the kaon, had been observed over half a century earlier). The Higgs mechanism is generally accepted as an important ingredient in the Standard Model of particle physics, without which certain particles would have no mass.
Nobel Prize in Chemistry for 2013
The Nobel Prize in Chemistry 2013 was awarded jointly to Martin Karplus (left), Michael Levitt (middle) and Arieh Warshel (right) "for the development of multiscale models for complex chemical systems".
Chemists used to create models of molecules using plastic balls and sticks. Today, the modelling is carried out in computers. In the 1970s, Martin Karplus, Michael Levitt and Arieh Warshel laid the foundation for the powerful programs that are used to understand and predict chemical processes. Computer models mirroring real life have become crucial for most advances made in chemistry today.
Chemical reactions occur at lightning speed. In a fraction of a millisecond, electrons jump from one atomic nucleus to another. Classical chemistry has a hard time keeping up; it is virtually impossible to experimentally map every little step in a chemical process. Aided by the methods now awarded with the Nobel Prize in Chemistry, scientists let computers unveil chemical processes, such as a catalyst’s purification of exhaust fumes or the photosynthesis in green leaves.
The work of Karplus, Levitt and Warshel is ground-breaking in that they managed to make Newton’s classical physics work side-by-side with the fundamentally different quantum physics. Previously, chemists had to choose between the two. The strength of classical physics was that calculations were simple and could be used to model really large molecules. Its weakness was that it offered no way to simulate chemical reactions. For that purpose, chemists instead had to use quantum physics. But such calculations required enormous computing power and could therefore only be carried out for small molecules.
This year’s Nobel Laureates in chemistry took the best from both worlds and devised methods that use both classical and quantum physics. For instance, in simulations of how a drug couples to its target protein in the body, the computer performs quantum theoretical calculations on those atoms in the target protein that interact with the drug. The rest of the large protein is simulated using less demanding classical physics.
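The division of labour described above can be sketched in a few lines. This is only an illustration of the partitioning idea, not real chemistry: the three energy functions below are invented stand-ins, chosen so that the expensive "quantum" calculation is applied to just a handful of atoms while the cheap classical one covers the rest.

```python
# Illustrative sketch of the multiscale (QM/MM-style) partitioning idea:
# treat a small "reactive" region with an expensive quantum-level energy
# function and the large remainder with a cheap classical one.
# All three energy functions are invented stand-ins, not real chemistry.

def quantum_energy(atoms):
    # Stand-in for an expensive quantum-mechanical calculation
    # (in reality its cost grows steeply with the number of atoms).
    return sum(a["charge"] ** 2 for a in atoms)

def classical_energy(atoms):
    # Stand-in for a cheap classical force-field calculation.
    return 0.1 * len(atoms)

def coupling_energy(qm_atoms, mm_atoms):
    # Stand-in for the interaction between the two regions.
    return 0.01 * len(qm_atoms) * len(mm_atoms)

def qmmm_energy(atoms, reactive_ids):
    """E_total = E_quantum(reactive) + E_classical(rest) + E_coupling."""
    qm = [a for a in atoms if a["id"] in reactive_ids]
    mm = [a for a in atoms if a["id"] not in reactive_ids]
    return quantum_energy(qm) + classical_energy(mm) + coupling_energy(qm, mm)

# A "protein" of 1000 atoms in which only 3 atoms touch the drug:
protein = [{"id": i, "charge": 1.0} for i in range(1000)]
e = qmmm_energy(protein, reactive_ids={0, 1, 2})
```

The design point is that the expensive function only ever sees the three reactive atoms, which is what makes simulating a large molecule tractable at all.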
Today the computer is just as important a tool for chemists as the test tube. Simulations are so realistic that they predict the outcome of traditional experiments.
Martin Karplus was born in 1930 and is an Austrian-born American theoretical chemist. He is the Theodore William Richards Professor of Chemistry, emeritus at Harvard University. He is also Director of the Biophysical Chemistry Laboratory, a joint laboratory between the French National Center for Scientific Research and the University of Strasbourg, France. Karplus received the 2013 Nobel Prize in Chemistry, together with Michael Levitt and Arieh Warshel, for "the development of multiscale models for complex chemical systems".
Michael Levitt, FRS was born in 1947 and is an American-British-Israeli biophysicist and a professor of structural biology at Stanford University, a position he has held since 1987. His research is in computational biology and he is a member of the National Academy of Sciences. Levitt received the 2013 Nobel Prize in Chemistry, together with Martin Karplus and Arieh Warshel, for "the development of multiscale models for complex chemical systems".
Arieh Warshel (Hebrew: אריה ורשל) was born in 1940 and is an Israeli-American Distinguished Professor of Chemistry and Biochemistry at the University of Southern California. He received the 2013 Nobel Prize in Chemistry, together with Michael Levitt and Martin Karplus, for "the development of multiscale models for complex chemical systems".
Nobel Prize in Medicine for 2013
The Nobel Prize in Physiology or Medicine 2013 was awarded jointly to James E. Rothman (left), Randy W. Schekman (middle) and Thomas C. Südhof (right) "for their discoveries of machinery regulating vesicle traffic, a major transport system in our cells".
The 2013 Nobel Prize was awarded jointly to three scientists who have solved the mystery of how the cell organizes its transport system. Each cell is a factory that produces and exports molecules. For instance, insulin is manufactured and released into the blood and signaling molecules called neurotransmitters are sent from one nerve cell to another. These molecules are transported around the cell in small packages called vesicles. The three Nobel Laureates have discovered the molecular principles that govern how this cargo is delivered to the right place at the right time in the cell.
Randy Schekman discovered a set of genes that were required for vesicle traffic. James Rothman unravelled protein machinery that allows vesicles to fuse with their targets to permit transfer of cargo. Thomas Südhof revealed how signals instruct vesicles to release their cargo with precision.
Through their discoveries, Rothman, Schekman and Südhof have revealed the exquisitely precise control system for the transport and delivery of cellular cargo. Disturbances in this system have deleterious effects and contribute to conditions such as neurological diseases, diabetes, and immunological disorders.
How cargo is transported in the cell
In a large and busy port, systems are required to ensure that the correct cargo is shipped to the correct destination at the right time. The cell, with its different compartments called organelles, faces a similar problem: cells produce molecules such as hormones, neurotransmitters, cytokines and enzymes that have to be delivered to other places inside the cell, or exported out of the cell, at exactly the right moment. Timing and location are everything. Miniature bubble-like vesicles, surrounded by membranes, shuttle the cargo between organelles or fuse with the outer membrane of the cell and release their cargo to the outside. This is of major importance, as it triggers nerve activation in the case of transmitter substances, or controls metabolism in the case of hormones. How do these vesicles know where and when to deliver their cargo?
Traffic congestion reveals genetic controllers
Randy Schekman was fascinated by how the cell organizes its transport system and in the 1970s decided to study its genetic basis by using yeast as a model system. In a genetic screen, he identified yeast cells with defective transport machinery, giving rise to a situation resembling a poorly planned public transport system. Vesicles piled up in certain parts of the cell. He found that the cause of this congestion was genetic and went on to identify the mutated genes. Schekman identified three classes of genes that control different facets of the cell's transport system, thereby providing new insights into the tightly regulated machinery that mediates vesicle transport in the cell.
Docking with precision
James Rothman was also intrigued by the nature of the cell's transport system. When studying vesicle transport in mammalian cells in the 1980s and 1990s, Rothman discovered that a protein complex enables vesicles to dock and fuse with their target membranes. In the fusion process, proteins on the vesicles and target membranes bind to each other like the two sides of a zipper. The fact that there are many such proteins and that they bind only in specific combinations ensures that cargo is delivered to a precise location. The same principle operates inside the cell and when a vesicle binds to the cell's outer membrane to release its contents.
It turned out that some of the genes Schekman had discovered in yeast coded for proteins corresponding to those Rothman identified in mammals, revealing an ancient evolutionary origin of the transport system. Collectively, they mapped critical components of the cell's transport machinery.
Timing is everything
Thomas Südhof was interested in how nerve cells communicate with one another in the brain. The signalling molecules, neurotransmitters, are released from vesicles that fuse with the outer membrane of nerve cells by using the machinery discovered by Rothman and Schekman. But these vesicles are only allowed to release their contents when the nerve cell signals to its neighbours. How is this release controlled in such a precise manner? Calcium ions were known to be involved in this process and in the 1990s, Südhof searched for calcium sensitive proteins in nerve cells. He identified molecular machinery that responds to an influx of calcium ions and directs neighbour proteins rapidly to bind vesicles to the outer membrane of the nerve cell. The zipper opens up and signal substances are released. Südhof's discovery explained how temporal precision is achieved and how vesicles' contents can be released on command.
Vesicle transport gives insight into disease processes
The three Nobel Laureates have discovered a fundamental process in cell physiology. These discoveries have had a major impact on our understanding of how cargo is delivered with timing and precision within and outside the cell. Vesicle transport and fusion operate, with the same general principles, in organisms as different as yeast and man. The system is critical for a variety of physiological processes in which vesicle fusion must be controlled, ranging from signalling in the brain to release of hormones and immune cytokines. Defective vesicle transport occurs in a variety of diseases including a number of neurological and immunological disorders, as well as in diabetes. Without this wonderfully precise organization, the cell would lapse into chaos.
James E. Rothman was born 1950 in Haverhill, Massachusetts, USA. He received his PhD from Harvard Medical School in 1976, was a postdoctoral fellow at Massachusetts Institute of Technology, and moved in 1978 to Stanford University in California, where he started his research on the vesicles of the cell. Rothman has also worked at Princeton University, Memorial Sloan-Kettering Cancer Center and Columbia University. In 2008, he joined the faculty of Yale University in New Haven, Connecticut, USA, where he is currently Professor and Chairman in the Department of Cell Biology.
Randy W. Schekman was born 1948 in St Paul, Minnesota, USA, studied at the University of California in Los Angeles and at Stanford University, where he obtained his PhD in 1974 under the supervision of Arthur Kornberg (Nobel Prize 1959) and in the same department that Rothman joined a few years later. In 1976, Schekman joined the faculty of the University of California at Berkeley, where he is currently Professor in the Department of Molecular and Cell Biology. Schekman is also an investigator of the Howard Hughes Medical Institute.
Thomas C. Südhof was born in 1955 in Göttingen, Germany. He studied at the Georg-August-Universität in Göttingen, where he received an MD in 1982 and a Doctorate in neurochemistry the same year. In 1983, he moved to the University of Texas Southwestern Medical Center in Dallas, Texas, USA, as a postdoctoral fellow with Michael Brown and Joseph Goldstein (who shared the 1985 Nobel Prize in Physiology or Medicine). Südhof became an investigator of the Howard Hughes Medical Institute in 1991 and was appointed Professor of Molecular and Cellular Physiology at Stanford University in 2008.
Nobel Prize in Literature for 2013
The Nobel Prize in Literature 2013 was awarded to Alice Munro, "master of the contemporary short story".
Alice Ann Munro (née Laidlaw) was born in 1931 and is a Canadian author writing in English. Munro's work has been described as having revolutionized the architecture of short stories, especially in its tendency to move forward and backward in time. Munro's fiction is most often set in her native Huron County in southwestern Ontario. Her stories explore human complexities in an uncomplicated prose style. Munro's writing has established her as "one of our greatest contemporary writers of fiction" or, as Cynthia Ozick put it, "our Chekhov." Alice Munro was awarded the 2013 Nobel Prize in Literature for her work as "master of the modern short story" and the 2009 Man Booker International Prize for her lifetime body of work. She is also a three-time winner of Canada's Governor General's Award for fiction.
Nobel Prize in Economics for 2013
The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel 2013 was awarded jointly to Eugene F. Fama (left), Lars Peter Hansen (middle) and Robert J. Shiller (right) "for their empirical analysis of asset prices".
There is no way to predict the price of stocks and bonds over the next few days or weeks. But it is quite possible to foresee the broad course of these prices over longer periods, such as the next three to five years. These findings, which might seem both surprising and contradictory, were made and analyzed by this year’s Laureates, Eugene Fama, Lars Peter Hansen and Robert Shiller.
Beginning in the 1960s, Eugene Fama and several collaborators demonstrated that stock prices are extremely difficult to predict in the short run, and that new information is very quickly incorporated into prices. These findings not only had a profound impact on subsequent research but also changed market practice. The emergence of so-called index funds in stock markets all over the world is a prominent example.
If prices are nearly impossible to predict over days or weeks, then shouldn’t they be even harder to predict over several years? The answer is no, as Robert Shiller discovered in the early 1980s. He found that stock prices fluctuate much more than corporate dividends, and that the ratio of prices to dividends tends to fall when it is high, and to increase when it is low. This pattern holds not only for stocks, but also for bonds and other assets.
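Shiller's pattern can be illustrated with a minimal mean-reversion sketch. The long-run mean, reversion speed, and ratios below are all invented numbers; the point is only the sign of the prediction: a high price-to-dividend ratio implies an expected fall, a low one an expected rise.

```python
# Toy illustration of Shiller's observation: the price-to-dividend ratio
# tends to fall back when it is high and rise when it is low.
# The long-run mean and reversion speed are invented for the example.

def expected_ratio_change(pd_ratio, long_run_mean=25.0, reversion=0.2):
    """Mean-reverting sketch: the ratio drifts toward its long-run mean."""
    return reversion * (long_run_mean - pd_ratio)

high = expected_ratio_change(40.0)  # high ratio: expected drift is negative
low = expected_ratio_change(15.0)   # low ratio: expected drift is positive
```

Over a few days the drift is tiny relative to noise, which is why short-run prices look unpredictable; over several years the drift dominates, which is the long-horizon predictability Shiller documented.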
One approach interprets these findings in terms of the response by rational investors to uncertainty in prices. High future returns are then viewed as compensation for holding risky assets during unusually risky times. Lars Peter Hansen developed a statistical method that is particularly well suited to testing rational theories of asset pricing. Using this method, Hansen and other researchers have found that modifications of these theories go a long way toward explaining asset prices.
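Hansen's statistical method is the generalized method of moments (GMM), which can be illustrated in a deliberately tiny setting. The "model" below is just "x has mean mu", an exactly identified toy case, with a brute-force grid search standing in for a real numerical optimizer; a genuine asset-pricing application would use many moment conditions and a weighting matrix.

```python
# Minimal sketch of the GMM idea: choose parameters so that sample
# averages of model-implied "moment conditions" are as close to zero
# as possible. The model here is simply "x has mean mu".

def sample_moment(data, mu):
    # g(x, mu) = x - mu; the model says E[g] = 0 at the true mu.
    return sum(x - mu for x in data) / len(data)

def gmm_objective(data, mu):
    # GMM minimizes a quadratic form in the sample moments.
    g = sample_moment(data, mu)
    return g * g

def estimate_mu(data, grid):
    # Brute-force minimization over a candidate grid; a real
    # implementation would use a numerical optimizer instead.
    return min(grid, key=lambda mu: gmm_objective(data, mu))

data = [2.0, 4.0, 6.0, 8.0]
grid = [i * 0.5 for i in range(0, 21)]  # candidates 0.0, 0.5, ..., 10.0
mu_hat = estimate_mu(data, grid)
```

In this exactly identified case the GMM estimate coincides with the sample mean; the power of the method shows up when there are more moment conditions than parameters, which is how rational asset-pricing theories get tested against data.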
Another approach focuses on departures from rational investor behavior. So-called behavioral finance takes into account institutional restrictions, such as borrowing limits, which prevent smart investors from trading against any mispricing in the market.
The Laureates have laid the foundation for the current understanding of asset prices. It relies in part on fluctuations in risk and risk attitudes, and in part on behavioral biases and market frictions.
Eugene Francis "Gene" Fama (/ˈfɑːmə/) was born in 1939 and is an American economist and Nobel laureate in Economics, known for his work on portfolio theory and asset pricing, both theoretical and empirical.
He is currently Robert R. McCormick Distinguished Service Professor of Finance at the University of Chicago Booth School of Business. In 2013 it was announced that he would be awarded the Nobel Prize in Economic Sciences jointly with Robert Shiller and Lars Peter Hansen.
Lars Peter Hansen was born in 1952 and is the David Rockefeller Distinguished Service Professor of Economics at the University of Chicago. Best known for his work on the Generalized Method of Moments, he is also a distinguished macroeconomist, focusing on the linkages between the financial and real sectors of the economy. In 2013, it was announced that he would be awarded the Nobel Memorial Prize in Economics, jointly with Robert J. Shiller and Eugene Fama.
Robert James "Bob" Shiller was born in 1946 and is an American economist, academic, and best-selling author. He currently serves as a Sterling Professor of Economics at Yale University and is a fellow at the Yale School of Management's International Center for Finance. Shiller has been a research associate of the National Bureau of Economic Research (NBER) since 1980, was Vice President of the American Economic Association in 2005, and President of the Eastern Economic Association for 2006-2007. He is also the co‑founder and chief economist of the investment management firm MacroMarkets LLC. Shiller is ranked among the 100 most influential economists of the world. On 14 October 2013, it was announced that Shiller, together with Eugene Fama and Lars Peter Hansen, would receive the 2013 Nobel Prize in Economics, “for their empirical analysis of asset prices”.
Nobel Prize For Peace 2013
The Nobel Peace Prize 2013 was awarded to the Organisation for the Prohibition of Chemical Weapons "for its extensive efforts to eliminate chemical weapons".
The Norwegian Nobel Committee has decided that the Nobel Peace Prize for 2013 is to be awarded to the Organisation for the Prohibition of Chemical Weapons (OPCW) for its extensive efforts to eliminate chemical weapons.
During World War One, chemical weapons were used to a considerable degree. The Geneva Convention of 1925 prohibited the use, but not the production or storage, of chemical weapons. During World War Two, chemical means were employed in Hitler’s mass exterminations. Chemical weapons have subsequently been put to use on numerous occasions by both states and terrorists. In 1992-93 a convention was drawn up that also prohibited the production and storage of such weapons. It came into force in 1997. Since then the OPCW has, through inspections, destruction and by other means, sought the implementation of the convention. 189 states have acceded to the convention to date.
The conventions and the work of the OPCW have defined the use of chemical weapons as a taboo under international law. Recent events in Syria, where chemical weapons have again been put to use, have underlined the need to enhance the efforts to do away with such weapons. Some states are still not members of the OPCW. Certain states have not observed the deadline, which was April 2012, for destroying their chemical weapons. This applies especially to the USA and Russia.
Disarmament figures prominently in Alfred Nobel’s will. The Norwegian Nobel Committee has through numerous prizes underlined the need to do away with nuclear weapons. By means of the present award to the OPCW, the Committee is seeking to contribute to the elimination of chemical weapons.
COMMENTARY: Congratulations to all recipients. The 2013 Nobel laureates include six Americans.
WIRED DIRECTLY INTO THEIR NERVOUS SYSTEM, THIS REMARKABLE ROBOTIC HAND WILL SOON ALLOW ONE AMPUTEE TO ACTUALLY TOUCH AND FEEL THINGS AGAIN.
Soon, the first feeling, articulating hand will be transplanted into a living patient.
About 50% of amputees don’t use their prosthesis because of relatively basic issues of design--comfort, aesthetics, and controllability. This has led inventor Dean Kamen to famously lament about humanity’s inability to offer our amputees anything better than “a hook on a stick.” Put in those terms, the lack of innovation makes your stomach churn.
It follows up research from 2009 (we believe, seen here) in which a patient was able to feel pin pricks in a tethered robotic hand. He could also wiggle its fingers.
But soon, a new bionic hand made by Prensilia may change that. Through a highly experimental test surgery, in a project led by Dr. Silvestro Micera of the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, the prosthesis will be wired directly into one test patient’s nervous system, which should enable movement through thought alone, along with the ability for the patient to actually feel what touches his or her mechanical hand.
Today, the hand has been improved. It now has sensors on the palm, fingers, and wrist.
The story almost sounds too amazing to be true. But the upcoming surgery is actually a follow-up to a 2009 study in which a simpler, fixed model of the hand was wired into a man’s nervous system to provide a sense of touch. It only had basic sensors embedded in the palm, but the patient was able to wiggle his fingers and feel pricks of a needle. Now, the latest wave of hardware and software technology will enable the transplant of a fully articulating, bionic hand (with sensors distributed in each fingertip, the palm, and the wrist). It’s also built with an improved interface that should permit multiple feelings and gestures at once, while the 2009 hand had an extremely limited bandwidth.
That patient will wear the hand for just a month before it’s removed, and then two years later, they’ll receive a more permanent, polished version of the technology.
Plus, it has more bandwidth. So the patient should be able to both feel and move his fingers at the same time.
You really can’t overstate the accomplishment at work. Hooked right into the amputee’s nervous system, this hand will be driven by thought alone (and probably a battery pack).
The human hand has always seemed like an invention that only nature could have made over the course of billions of years. Strong enough to crush an orange, deft enough to thread a needle, we’re downright lucky to be born with a pair of the most perfect tools that respond to our every whim. But it’s their ability to feel that elevates them from just another tool to part of us, that enables the thousands of tiny compensations we make all day as we interact with the world with softness and force. That’s why this single invention and surgery are so exciting--together they could solve one of the ultimate human-factor issues in medicine. And better still? Researchers say if all goes well, we’ll see widespread clinical adoption of such prostheses in the not-so-distant future.
Successful or not, the hand will stay on the patient for just a month, after which time the team will spend the next two years polishing a more final, potentially clinic-ready version.
And for the first time in history, we may have developed a halfway decent solution for those missing our most crucial tool, the human hand.
COMMENTARY: Prensilia's amazing robotic hand is the closest thing to a real hand, allowing amputees to actually feel what they touch. The sense of touch has been missing from artificial limbs and hands for a long time, and it finally looks like we have overcome that weakness. I don't know how much the Prensilia robotic hand will cost, but I have a feeling that it will be relatively expensive until the company can produce them at scale. Hopefully, federal agencies like Medicare and Medicaid will cover all or most of the cost. The world needs this product because it fills such a huge need among disabled amputees.
Actor Karl Urban played the part of a young Dr. "Bones" McCoy in the 2009 movie "Star Trek."
In the future, you’ll be able to figure out what’s wrong with you (or your child) simply by scanning them with your cell phone. In the present, two companies are racing to make the first prototype.
The medical tricorder, a handheld device in the Star Trek universe used to diagnose diseases and keep track of vital signs, once seemed a sci-fi impossibility alongside teleportation and alien encounters. Not anymore. The $10 million Qualcomm Tricorder X Prize, officially announced this week, challenges entrants to create a mobile platform that can accurately diagnose 15 diseases across 30 patients in three days. We caught up with two startups--Senstore and Scanadu--that think they can pull it off.
Scanadu has been working on a non-invasive, non-contact, non-sampling (no saliva, urine, stool sample necessary) tricorder since before the X Prize challenge was announced. The startup, which raised $2 million in November, was only founded last January. But co-founder Walter De Brouwer set up a research lab in Belgium--Starlab--in the late 1990s, where he prototyped a tricorder-like device. It was too far ahead of its time. Scanadu co-founder and COO Misha Chellam explains.
"It was the size of a backpack. It was an interesting idea but not really workable."
Then, in 2006, De Brouwer’s son suffered from a traumatic brain injury and was hospitalized for three months. The tricorder idea resurfaced. This time, De Brouwer, Chellam, and the rest of the nine-person Scanadu team (including two ex-NASA scientists and three bioengineers), are working on a sensor-filled medical tricorder that can be integrated into a smartphone.
The tricorder can be broken down into a few component pieces:
Biological sensor input (e.g. exhaling your breath so its chemical components can be analyzed).
Vital signs (heart rate, blood pressure and temperature).
Imaging components (used to identify a rash, for example).
AI software that can make sense of all the inputs.
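The four-component breakdown above can be sketched as a simple data pipeline. This is a hypothetical illustration only, not Scanadu's actual design; every class, field and threshold below is invented.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One snapshot of the tricorder's inputs (hypothetical schema)."""
    breath_markers: dict     # biological sensor input, e.g. exhaled chemical levels
    heart_rate: float        # beats per minute
    blood_pressure: tuple    # (systolic, diastolic) in mmHg
    temperature_c: float     # body temperature in Celsius
    skin_image: bytes        # raw bytes from the imaging component

def triage(reading: Reading) -> str:
    """Toy stand-in for the AI layer that fuses all the inputs.

    A real diagnostic engine would run trained models over every channel;
    here a single vital-signs rule illustrates how the data flows together.
    """
    if reading.temperature_c >= 38.0 or reading.heart_rate > 120:
        return "flag: see a clinician"
    return "no alert"

reading = Reading(
    breath_markers={"acetone_ppm": 0.5},
    heart_rate=72.0,
    blood_pressure=(118, 76),
    temperature_c=38.6,   # feverish
    skin_image=b"",
)
print(triage(reading))  # the fever trips the rule
```

The hard part, as the article notes, is not the plumbing but the validated diagnostic models and FDA approval sitting behind the `triage` step.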
Chellam claims that a prototype will be ready by the end of 2012, and a commercial device will be ready in three years. How can Scanadu build such a futuristic concept so quickly? Much of the technology is already available or in the works--it’s mostly a matter of getting FDA approval, gaining consumer trust, and, of course, putting it all together without draining the smartphone’s battery.
Scanadu plans to first market the device to parents who want to manage their children’s health. The device--which Chellam speculates could cost around $199--could, for example, be used to detect whether an infection is bacterial or viral and monitor temperature while the user is asleep.
Despite its quick pace of development, Scanadu is still looking to collaborate. Chellam says.
"There’s a lot of innovation in this space, and we certainly don’t think we can build this thing on our own."
That brings us to Senstore, another startup that’s working on a medical tricorder--but one that will be open source. Senstore got its start at Singularity University's 2011 summer graduate program, where the current Senstore team took on the challenge of using sensor technology to solve global health problems.
The team was inspired by a Singularity University talk from Chris Anderson of DIY Drones, a community of thousands of enthusiasts working on unmanned aerial vehicles. Notes Senstore co-founder Rachel Kalmar.
"It’s easy to see why people would want to build drones because playing with quadcopters is fun. We were less convinced that people would be interested in hacking our tools for health. We spent a lot of time trying to validate that."
Whereas Scanadu is building its tricorder in house, Senstore is creating a platform where people can collect sensor data and apply diagnostic algorithms. Kalmar says.
"The idea is that people closest to problems are going to have a set of tools that make it easy for them to prototype solutions."
It’s possible, then, that people will build multiple versions of the tricorder on top of Senstore’s platform--perhaps a malaria-specific tricorder or a tuberculosis tricorder.
Unlike Scanadu, Senstore probably won’t have a true tricorder prototype ready by the end of 2012. But the startup was recently accepted to the Rock Health accelerator, and in the spring, Senstore plans to launch a Kickstarter campaign to build something "a little more consumer oriented, making it easy for people to get data from wearable sensors and stream it to the cloud," explains Kalmar. Senstore hopes to have sensor kits available for people to experiment with by May.
It’s all a stepping stone along the path to creating a tricorder. When a polished version is finally built, don’t be surprised if it’s a mishmash of ideas from both Scanadu and Senstore. Kalmar says.
"We would like to collaborate."
A sickbay (see below) that uses space-age technology to diagnose diseases ranging from stomach bugs to cancer has been unveiled at a British hospital.
The first of its kind, it contains a bewildering array of equipment, including probes designed for missions to Mars.
The gadgets in the million-pound unit can detect illness without the need for painful and invasive tests. They combine information about the sight, smell and ‘feel’ of a disease to produce a diagnosis.
The unit is described as the first step towards the tricorder scanners that Star Trek’s Dr McCoy waved in front of patients’ bodies to diagnose and treat illness in the crew of the Starship Enterprise.
COMMENTARY: Launched at the Consumer Electronics Show in Las Vegas, the $10 million Qualcomm Tricorder X Prize challenges researchers to come up with a tricorder capable of capturing “key health metrics and diagnosing a set of 15 diseases” in the sick. The device must also be light enough for a person to carry, so a maximum weight of 5 lb (2.2 kg) has been set.
I have a feeling that we will be seeing a variety of medical diagnostic and biofeedback apps running on today's mobile devices, like smartphones and tablets, carrying out the functions of a medical tricorder. Diagnostic sensors could be connected to the mobile device, making contact with the patient's skin to record vitals and make other diagnoses. Being able to diagnose 15 diseases on one device will be the challenging part.
In Star Trek, the medical tricorder is technology that belongs to the 23rd Century. It was used by Dr. McCoy, Spock and the medical/science team as far back as the first episodes, giving the crew the ability to diagnose a patient by scanning the body.
Similar devices already exist, but not in such a small form. Prof. Jeremy Nicholson, head of the department of surgery and cancer at Imperial College London, told the BBC that today's devices detect chemical signs of illness to aid patient diagnosis, but that shrinking one down to Trek-like tricorder size is a huge challenge, and he doubts anyone will be able to achieve it.
Professor Nicholson said.
“The most likely sort of technology would be something that detects metabolites. What we use in our laboratory is big – the size of a Mini. The challenge is sticking it all into one device."
As a final tribute to Star Trek, here's a short video clip of actor Karl Urban as Dr. "Bones" McCoy in the 2009 Star Trek movie.
Here's a very interesting background video of Karl Urban and how he more than measured up to DeForest Kelley's "Dr. Bones McCoy" from the original Star Trek television series. Actor Chris Pine, who played cadet James T. Kirk, said that Urban's Dr. McCoy role was "spot on." I agree.
Courtesy of an article dated January 12, 2012 appearing in Fast Company, an article dated January 12, 2012 appearing in What Culture, and an article dated September 1, 2011 appearing in Mail Online
(September 27, 2010) -- Rice University physicist Dmitri Lapotko has demonstrated that plasmonic nanobubbles, generated around gold nanoparticles with a laser pulse, can detect and destroy cancer cells in vivo by creating tiny, shiny vapor bubbles that reveal the cells and selectively explode them. The nanobubbles have been tested in theranostics with live human prostate cancer cells, without harming the animal host.
A paper in the October print edition of the journal Biomaterials details the effect of plasmonic nanobubble theranostics on zebra fish implanted with live human prostate cancer cells, demonstrating the guided ablation of cancer cells in a living organism without damaging the host. This is not the first time Rice University has used nanotechnology to advance cancer detection and destruction.
Lapotko and his colleagues developed the concept of cell theranostics to unite three important treatment stages -- diagnosis, therapy and confirmation of the therapeutic action -- into one connected procedure. The unique tunability of plasmonic nanobubbles makes the procedure possible. Their animal model, the zebra fish, is nearly transparent, making it well suited to in-vivo research.
The key elements of plasmonic nanobubble research:
Plasmonic nanobubbles: laser pulse and plasmonic nanoparticle-generated transient event with tunable optical and mechanical properties.
Physical and optical properties of plasmonic nanoparticles at high temperatures and in multi-phase environments.
Methods for imaging and characterization of plasmon nanoparticles.
Heat transfer at nano-scale.
Interaction of plasmonic nanobubbles with living cells and tissue.
Zebrafish: an optically transparent organism as a model for plasmonic nanomedicine.
The key elements of cell theranostics:
Cell theranostics: dynamically tuned intracellular plasmonic nanobubbles combine diagnosis (through optical scattering), therapy (through mechanical, nonthermal and selective damage of target cells) and optical guidance of the therapy into one fast process.
High-sensitivity imaging and diagnosis of cells with plasmonic nanobubbles that may provide up to a 10²–10³-fold increase in sensitivity compared to gold nanoparticles and a 10⁵–10⁶-fold increase in sensitivity compared to fluorescent molecules.
Targeted therapy with plasmonic nanobubbles: LANTCET (laser-activated nano-thermolysis as cell elimination technology). Applications: treatment of leukemia and of superficial tumors.
Controlled release and intracellular delivery of therapeutic and diagnostic agents into cells.
Methods for imaging plasmonic nanoparticles in living cells and in tissue.
Micro-surgery with plasmonic nanobubbles: recanalization of occluded coronary arteries.
The National Institutes of Health has recognized the potential of Lapotko's technique by funding further research that holds tremendous potential for the theranostics of cancer and other diseases at the cellular level. Lapotko's Plasmonic Nanobubble Lab, a joint American-Belarusian laboratory for fundamental and biomedical nanophotonics, has received a grant worth more than $1 million over the next four years to continue developing the technique.
In earlier research in Lapotko's lab at the National Academy of Sciences of Belarus, plasmonic nanobubbles demonstrated their theranostic potential. In another study on cardiovascular applications, nanobubbles were filmed blasting their way through arterial plaque. The stronger the laser pulse, the more damaging the explosion when the bubbles burst, making the technique highly tunable. The bubbles range in size from 50 nm to more than 10 µm.
In the zebra-fish study, Lapotko and his collaborators at Rice directed antibody-tagged gold nanoparticles into the implanted cancer cells. A short laser pulse overheated the surface of the nanoparticles and evaporated a very thin volume of the surrounding medium to create small vapor bubbles that expanded and collapsed within nanoseconds; this left cells undamaged but generated a strong optical scattering signal that was bright enough to detect a single cancer cell.
A second, stronger pulse generated larger nanobubbles that exploded (mechanically ablated) the target cell without damaging surrounding tissue in the zebra fish. Scattering of the laser light by the second “killer” bubble confirmed the cellular destruction. That the process is mechanical in nature is key, Lapotko said. The nanobubbles avoid the pitfalls of chemo- or radiative therapy that can damage healthy tissue as well as tumors. "It's not a particle that kills the cancer cell, but a transient and short event," he said. "We're converting light energy into mechanical energy."
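The two-pulse sequence described above can be sketched as a simple control loop. This is purely illustrative: the pulse energies, the scattering threshold and the toy optics model below are all invented placeholders, not values from the Rice experiments.

```python
PROBE_ENERGY = 1.0      # hypothetical units: makes a small, harmless bubble
KILLER_ENERGY = 5.0     # hypothetical units: makes a large, ablating bubble
DETECT_THRESHOLD = 0.5  # invented scattering-signal threshold

def bubble_scattering(pulse_energy: float, tagged: bool) -> float:
    """Toy optics: only nanoparticle-tagged cells grow a bubble, and a
    stronger pulse makes a bigger (brighter-scattering) bubble."""
    return pulse_energy if tagged else 0.0

def theranostic_sequence(tagged: bool) -> str:
    """Detect -> ablate -> confirm, mirroring the two-pulse description above."""
    # Step 1: weak probe pulse -- detection without damaging the cell
    if bubble_scattering(PROBE_ENERGY, tagged) < DETECT_THRESHOLD:
        return "no cancer cell detected"
    # Step 2: stronger pulse -- the larger bubble mechanically ablates the cell
    killer_signal = bubble_scattering(KILLER_ENERGY, tagged)
    # Step 3: scattering from the "killer" bubble confirms the destruction
    return "ablated and confirmed" if killer_signal >= DETECT_THRESHOLD else "unconfirmed"

print(theranostic_sequence(True))   # tagged cancer cell
print(theranostic_sequence(False))  # healthy, untagged cell
```

The point of the sketch is the tunability: the same physical mechanism serves as sensor or scalpel depending only on pulse energy.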
The new grant will allow Lapotko and his collaborators to study the biological effects of plasmonic nanobubbles and then combine their functions into a single sequence that would take a mere microsecond to detect and destroy a cancer cell and confirm the results. "By tuning their size dynamically, we will tune their biological action from noninvasive sensing to localized intracellular drug delivery to selective elimination of specific cells," he said.
"Being a stealth, on-demand probe with tunable function, the plasmonic nanobubble can be applied to all areas of medicine, since the nanobubble mechanism is universal and can be employed for detecting and manipulating specific molecules, or for precise microsurgery."
Lapotko's co-authors on the Biomaterials paper are Daniel Wagner, assistant professor of biochemistry and cell biology; Mary “Cindy” Farach-Carson, associate vice provost for research and professor of biochemistry and cell biology; Jason Hafner, associate professor of physics and astronomy and of chemistry; Nikki Delk, postdoctoral research associate; and Ekaterina Lukianova-Hleb, researcher in the Plasmonic Nanobubble Lab.
COMMENTARY: Nanotechnology was all the rage about ten years ago, and a lot of VCs got into it, but much of that nanotech research went for naught, and none of those startups ever turned into the next Genentech. In fact, nanotech remains mostly in the laboratory, with little monetization. The claims that nanotechnology would lead to nano medicines or "nano robots" that would be used to repair tissues and kill bacteria and viruses sounded like so much science fiction. Rice University's in vivo research with plasmonic nanobubbles could be the turning point.
When I read about this Rice University research into plasmonic nanobubbles that can attach themselves to cancer cells and kill them, it reminded me of those mechanical tentacled drones in the Matrix movies that attached themselves to the hull of Morpheus' ship, the Nebuchadnezzar. I just had to look into it further.
Current cancer treatment is highly invasive, using hyperthermia (destroying cancerous cells with high temperatures, also called thermotherapy), radiation or chemotherapy to destroy cancerous tissues, with painful and uncomfortable after-effects. Plasmonic nanobubbles kill cancer cells through cell theranostics, which combines three important treatment stages -- diagnosis, therapy and confirmation -- simultaneously. Present-day cancer treatment involves various tests, including a biopsy, to determine whether a patient has cancer. Cell theranostics using plasmonic nanobubbles can collapse all of this into one step, greatly reducing treatment times.
I would use the analogy of a multiple-warhead, heat-seeking guided missile that "locks in" on multiple targets simultaneously -- in this case the cancerous cells -- and uses plasmonic nanobubbles (the warheads) to destroy the cancer-infected cells.
Plasmonic nanobubbles are still at a very early research and development stage. More in vivo tests need to be completed, including tests on human beings, and cell theranostics using plasmonic nanobubbles must be approved by the FDA before it becomes an accepted method for the treatment of cancer.
Courtesy of an article dated September 27, 2010 appearing in ElectroIQ
Salmonella bacteria, members of the Proteobacteria, a major group of Gram-negative bacteria
Bacterial resistance to antibiotics is a growing problem, particularly for hospitals. Of particular concern for hospitals are Gram-negative bacteria, which have a two-layered cell wall that makes them especially resistant to existing drugs.
As a result, start-ups with potential treatments for Gram-negative infections are now hot targets for venture investors and corporate acquirers.
Among them is Rempex Pharmaceuticals, a San Diego-based startup which VentureWire reports has now raised up to $76 million in venture capital for a novel approach to fighting the problem.
Rempex Pharmaceuticals hasn’t said how its internally discovered drugs work or whether they fall into an existing class of antibiotics. But Chief Executive Daniel Burgess said his company’s drugs will be effective against multiple Gram-negative bacteria that now escape the effects of today’s antibacterials. Various types of drugs, such as beta lactams, aminoglycosides and fluoroquinolones, are now used against Gram-negative bacteria, but resistance is a problem for antibiotics in each class, according to Burgess.
The conditions Rempex could target include intra-abdominal infections and complicated urinary-tract infections, he said. The company said it expects to file for U.S. approval for its first product in the second half of 2012 and to seek approval to begin clinical studies of its second drug early next year, but Burgess didn’t give further details.
A 2009 Centers for Disease Control and Prevention report estimated the overall annual direct medical cost of hospital-acquired infections in U.S. hospitals at $28.4 billion to $33.8 billion, or $35.7 billion to $45 billion, depending on which Consumer Price Index adjustment is used to account for the rate of inflation in hospital resource prices.
Seeing opportunity in this problem, pharmaceutical companies are showing more interest in antibiotics after years of favoring drugs for chronic problems. Last year, for example, an anti-infectives spinout from Sanofi, Novexel, was acquired by AstraZeneca in a $505 million deal.
In late 2009, Cubist Pharmaceuticals acquired venture-funded Calixa Therapeutics to secure access to an intravenous therapy for certain Gram-negative infections in the hospital. Meanwhile, private drug companies in this field such as Achaogen and Tetraphase Pharmaceuticals have raised large venture rounds.
COMMENTARY: With annual sales of over $26 billion, antibiotics represent one of the largest therapeutic categories from a revenue perspective. Each year, almost 2 million Americans develop hospital-acquired infections such as sepsis and pneumonia, and over 95,000 of those infections result in patient death. The growing problem of drug-resistant bacteria will continue to drive growth in new and expanding market opportunities.
Gram-negative bacteria are bacteria that do not retain crystal violet dye in the Gram staining protocol. In a Gram stain test, a counterstain (commonly safranin) is added after the crystal violet, coloring all Gram-negative bacteria with a red or pink color. The test itself is useful in classifying two distinct types of bacteria based on the structural differences of their bacterial cell walls. Gram-positive bacteria will retain the crystal violet dye when washed in a decolorizing solution.
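The staining protocol described above boils down to a simple decision rule, sketched here purely as an illustration of the classification logic:

```python
def gram_classification(retains_crystal_violet: bool) -> str:
    """Classify a bacterium by the outcome of the Gram stain test.

    Gram-positive cell walls retain the crystal violet dye through the
    decolorizing wash; Gram-negative cells lose it and instead take up
    the red/pink counterstain (commonly safranin).
    """
    if retains_crystal_violet:
        return "Gram-positive (violet)"
    return "Gram-negative (red/pink from counterstain)"

print(gram_classification(True))
print(gram_classification(False))
```

As the article notes further down, the morphological definition (outer membrane present or absent) is what actually matters clinically; the stain result is the usual, but not infallible, proxy for it.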
Cell structure of Gram-negative (left) and Gram-positive (right) bacteria
The pathogenic capability of Gram-negative bacteria is often associated with certain components of Gram-negative cell walls, in particular, the lipopolysaccharide layer (also known as LPS or endotoxin layer). In humans, LPS triggers an innate immune response characterized by cytokine production and immune system activation. Inflammation is a common result of cytokine (from the Greek cyto, cell, and kinesis, movement) production, which can also produce host toxicity.
The following characteristics are displayed by Gram-negative bacteria:
Most of them contain Braun's lipoprotein, which serves as a link between the outer membrane and the peptidoglycan chain via a covalent bond
Most do not sporulate (Coxiella burnetii, which produces spore-like structures, is a notable exception)
Gram-negative bacterial infection refers to a disease caused by Gram-negative bacteria. One example is E. coli.
It is important to recognize that this class is defined morphologically (by the presence of a bacterial outer membrane), and not histologically (by a pink appearance when stained), though the two usually coincide.
There are many groups of Gram-negative bacteria, such as:
Green Non-Sulphur Bacteria
Proteobacteria is one of the major groups of known Gram negative bacteria and includes bacteria like:
Legionella (causes Pontiac fever and Legionnaires’ disease)
Acetic Acid Bacteria
Along with the bacteria mentioned above, there are several other types of Gram negative bacteria such as:
Haemophilus influenzae (also known as Bacillus influenzae)
Acinetobacter baumannii (which falls under the nosocomial Gram-negative bacterial group).
One reason for this division is that the outer membrane is of major clinical significance: it can play a role in the reduced effectiveness of certain antibiotics, and it is the source of endotoxemia, in which endotoxin (a toxic substance associated with the bacterial cell wall or core) comes into contact with the bloodstream and mixes with the blood. Once the endotoxin has mixed into the blood, it becomes very hard to stop it from harming or destroying healthy tissues and causing inflammation. The substance can reach any part of the body and begin damaging tissue. The Gram-negative bacteria themselves can be killed with medication, but the endotoxin is very hard to clear from the blood.
The gram status of some organisms is complex or disputed:
Mycoplasma are sometimes considered gram negative, but because of their lack of a cell wall and unusual membrane composition, they are sometimes considered separately from other gram negative bacteria.
Gardnerella is often considered gram negative, but it is classified in MeSH as both gram positive and gram negative. It has some traits of gram positive bacteria, but has a gram negative appearance. It has been described as a "gram-variable rod".
Proteus Biomedical, a biomedical technology company out of Redwood City, California, was selected as one of the World Economic Forum's Technology Pioneers for 2009 in the field of intelligent medicine.
The company develops MEMS (microelectromechanical systems) devices for medical applications, small enough to be attached to pills for use as "ingestible event markers", as well as potentially permanently embedded blood-glucose monitoring chips.
A statement by the company:
"Proteus has developed a unique approach to personalizing therapy," said Andrew Thompson, Proteus CEO and co-founder. "We embed micro-sensors into existing drugs and devices, which transmit information, securely, to a person's cell phone via the Internet. A person can understand how their body is responding to their therapy, and, if they choose, share this information with a family member, physician or friend to help them stay healthy. We are delighted that the World Economic Forum has recognized the immense potential of this approach and look forward to actively participating in their programs."
Check out this video interview with Andrew Thompson:
COMMENTARY: That's very cool technology that can give doctors immediate feedback about how a specific drug is performing and how the patient is reacting to that drug.
In the above video, Andrew Thompson makes one of the best "pitches" that I have heard in a long time. That's what I call a knockout pitch.
Essentially, Proteus Biomedical makes "intelligent medicine": therapeutic drugs paired with tiny biosensors on a microchip, embedded in a pill, that communicate with a cell phone via Bluetooth.
Proteus Biomedical's intelligent medicine integrates electronics, sensors, and wireless communications into pharmaceuticals like pills. It offers a monitoring system called Raisin Personal Monitor, a wireless health monitor for remote recording and analysis of heart rate, physical activity, body position, and patient-logged events.
The company, which was founded in 2001 and is based in Redwood City, California, develops these biosensors to allow cardiac resynchronization therapy to be tailored and adjusted based on a patient’s cardiovascular physiology.
The biosensors focus on sensing, communicating, and optimizing cardiovascular performance for chronic heart failure management. The company also offers an intelligent pharmaceutical system that communicates patient-specific medication-taking behavior and physiologic response.
Its products address various therapeutic applications, including cardiovascular and respiratory diseases, metabolic and central nervous system disorders, and oncology.
When ingested, any discrete event (such as the ingestion of a specific pharmaceutical) can be recorded. The marker also records physiologic information such as heart rate, activity, body angle and patient-logged information. The unique ingestion event and all logged information are then communicated via Bluetooth to any computerized device, such as mobile phone applications. The system is being developed as part of an integrated intelligent medicine system to track treatment response and outcomes. Proteus partners are currently developing these products to treat diabetes, cardiovascular disease, psychiatric disorders, organ transplantation and infectious disease.
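To make the data flow concrete, here is a hypothetical sketch of the kind of record such a system might log and relay to a phone. The schema and field names are invented for illustration; Proteus has not published its actual format.

```python
from dataclasses import dataclass, field
import time

@dataclass
class IngestionEvent:
    """One logged event from an ingestible marker (hypothetical schema)."""
    drug_name: str
    timestamp: float = field(default_factory=time.time)
    heart_rate: float = 0.0      # physiologic channels logged alongside the event
    activity_level: float = 0.0  # e.g. fraction of full activity, 0.0-1.0
    body_angle_deg: float = 0.0  # body position at the time of ingestion

def to_bluetooth_payload(event: IngestionEvent) -> dict:
    """Flatten the event into a dict a phone app could receive over Bluetooth."""
    return {
        "drug": event.drug_name,
        "ts": event.timestamp,
        "vitals": {
            "hr": event.heart_rate,
            "activity": event.activity_level,
            "angle": event.body_angle_deg,
        },
    }

evt = IngestionEvent("metformin", heart_rate=68.0, activity_level=0.2, body_angle_deg=85.0)
print(to_bluetooth_payload(evt)["drug"])
```

The interesting design property is pairing the ingestion event itself with the physiologic channels, so adherence and response arrive in the same record.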
Stroke patients striving to walk normally may get a lift from a bionic leg developed by venture-backed Tibion Corp.
Many stroke survivors have weakness on one side that impairs their gait. Tibion’s device, worn around the leg during physical therapy, supplies the power needed to stand or walk, and may enable more intense therapy sessions. As strength and control returns, it assists the leg less.
Tibion, which has placed the device in 10 U.S. rehabilitation centers since launching it in January, soon intends to close a $15 million Series B round to expand the rollout, said Chief Executive Charles Remsberg. Previous backers include Claremont Creek Ventures and Saratoga Ventures.
The company’s product, the first of its kind, targets a large problem. Stroke, in which blood stops flowing to the brain, affects 795,000 Americans a year, according to the American Heart Association. A leading cause of long-term disability, stroke costs the U.S. about $70 billion annually in direct and indirect expenses, including health-care services, medications and missed work.
Early research suggests that Tibion’s device could reduce patients’ dependence on health services by helping them recover their ability to walk, drive and live independently. A study at New York Presbyterian Hospital could supply more hard evidence. The trial is comparing 12 patients receiving conventional therapy to 12 rehabbing with the bionic leg’s added boost, according to Remsberg, who said he is talking with universities and the U.S. Department of Veterans Affairs about additional studies.
The company’s product, which wraps around the leg and extends from the thigh to the ankle, amplifies a patient’s effort, adding just enough mechanical lift to enable him to stand or take a step. Sensors in the patient’s shoe detect weight distribution and trigger the robot to deliver the amount of assistance that a therapist has programmed in.
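That sense-and-assist loop can be sketched as a proportional controller: amplify the patient's measured effort by a therapist-set gain, capped for safety. Everything here is invented for illustration; Tibion has not published its control algorithm, and the numbers are placeholders.

```python
def assist_torque(patient_effort: float, assist_gain: float, max_torque: float = 40.0) -> float:
    """Amplify the patient's own effort by a therapist-programmed gain.

    patient_effort: torque (Nm) the patient produces, inferred from the
                    shoe's weight-distribution sensors (hypothetical input).
    assist_gain:    extra lift per unit of effort; the therapist dials this
                    down as the patient's strength and control return.
    """
    torque = patient_effort * assist_gain
    return min(torque, max_torque)   # safety cap on motor output

# Early in rehab: weak effort, high gain -> the robot supplies most of the lift
print(assist_torque(patient_effort=5.0, assist_gain=4.0))   # 20.0 Nm
# Later: stronger effort, lower gain -> the patient does most of the work
print(assist_torque(patient_effort=15.0, assist_gain=0.5))  # 7.5 Nm
```

The key idea the article describes survives even in this toy form: because assistance scales with the patient's own effort, the device fades out as recovery progresses.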
Stroke survivors with “hemiparesis,” or muscle weakness on one side of the body, initially have little movement in one leg, so therapists traditionally have done most of the work to get the limb moving. Tibion aims to shift the effort to the patient. “We’re making it possible for the patient to work more intensively with the affected leg,” Remsberg said.
On a recent morning at Whittier Rehabilitation Hospital in Bradford, Mass., patients performed sit-to-stand drills and other exercises with the robot’s help. One patient, a 63-year-old man, said he needed help to get out of bed shortly after his stroke, but has improved with therapy.
“I’m walking straight,” he said. “I’m not dragging my foot [any] more.”
How much the bionic leg contributed to his recovery is not entirely clear, but scientists suspect the device helps stroke survivors to form new neural connections to compensate for the ones they lost. By enabling them to work their leg more, the robot appears to help magnify afferent, neural signaling that reawakens the brain to the limb’s presence, Remsberg said.
Tibion has developed the first ambulatory device that can rehabilitate stroke patients with gait impairment. And the stroke market is very, very large.
In 2006, 890,000 Americans were discharged with a diagnosis of stroke – 606,000 in people over the age of 65, and 236,000 between the ages of 45 and 64. Average length of stay (LOS) ranged from 4.8 to 5.1 days.
In 2005, approximately 5,839,000 Americans had a history of stroke, about 50% of whom suffer gait disturbances that limit their ambulation and place them at sharply increased risk of falls.
Escalating rates of obesity and Type II diabetes have led many researchers to forecast a “stroke tsunami” in the coming decade.
Only the Tibion Bionic Leg offers millions of chronic stroke survivors the potential to regain lost mobility – and a reason to return to therapy.
Rehabilitation hospitals. Tibion believes that approximately 2000 freestanding U.S. rehabilitation hospitals represent the greatest opportunity for its commercialization efforts.
Rehabilitation hospitals certified as stroke centers…
Often admit acute-stroke patients directly from ambulance transport, when reimbursement can equal $15,000/day.
Enjoy the highest reimbursement for post-acute rehabilitation, ranging from $1,500 to $4,000/day
Justify their higher reimbursement and retain patients for post-acute inpatient rehab by acquiring and promoting the latest rehab technology.
Enjoy a volume of managed care and Medicare stroke inpatients and outpatients that justifies the purchase of multiple Bionic Legs.
Represent excellent prospects for encouraging patients to rent a planned home Bionic Leg.
Skilled nursing facilities. Roughly 15,000 skilled nursing facilities (SNFs) often compete with rehab hospitals for residential stroke rehabilitation – if they have technology like the Bionic Leg.
When stroke patients leave an acute-care hospital for residential rehab, patient families often receive several qualified SNF referrals from a discharge planner or case manager – and the family must visit each and select a provider.
SNFs often use technology as a competitive tool to persuade families to bring patients to their facilities.
Medicare Part A fully pays for 20 days of residential rehabilitation at an SNF, after which patients can elect to pay a $137/day copayment for up to 100 additional days.
SNF technology not available at local outpatient rehab clinics can persuade SNF patients to pay to remain for many days beyond the 20 Medicare-paid days.
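The Medicare math in the bullets above is easy to sketch: the first 20 days are fully covered, and each day beyond that (up to 100 more) costs the patient the $137 daily copayment.

```python
def patient_copay(total_days: int, copay_per_day: float = 137.0) -> float:
    """Out-of-pocket cost of an SNF stay under the Part A rules described above.

    Days 1-20 are fully paid by Medicare; days 21-120 carry the daily
    copayment. (Stays past 120 days fall outside this simple sketch.)
    """
    extra_days = max(0, min(total_days - 20, 100))
    return extra_days * copay_per_day

print(patient_copay(20))   # fully covered
print(patient_copay(50))   # 30 copay days
```

A 50-day stay, for example, leaves the patient 30 copay days, or $4,110 out of pocket -- the kind of sum a facility's technology pitch has to justify.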
Other candidate Bionic Leg customers include 715 accredited U.S. stroke centers, 153 Veterans Administration hospitals with rehab facilities, and 895 smaller VA rehab clinics.
Tibion is now testing a new “pay-as-you go” billing system that holds the potential to make the Bionic Leg available to an even wider range of facilities, including thousands of small private clinics with only a few stroke patients a week, and even home care PTs.
Tibion is aware of no comparable, wearable robotic stroke rehab devices marketed or reported in development anywhere in the world.
The closest robotic device in use for stroke gait therapy is the Hocoma Lokomat, a $300,000 body-weight support system limited to treadmill use. Several exoskeleton devices – essentially, “vertical wheelchairs” – have been developed for spinal cord injured patients and for certain military applications.
The Bionic Leg is protected by a battery of patents and proprietary software technology.
In addition to planned enhancements and a home version of the Bionic Leg, Tibion anticipates development of upper extremity robotics that will consolidate its position as a robotic rehabilitation leader.
I have seen stroke victims undergo lengthy and very painful rehabilitation. My mother had a stroke, and it was very painful for her just to bend her leg, let alone walk. Thankfully, she was able to walk again.
Tibion's bionic assisted-walking device definitely fills a serious need in the healthcare industry, not just for stroke victims, but for post-operative knee-joint patients as well.
Scientists are on the brink of radically expanding the span of a healthy life. Author Sonia Arrison on the latest advances—and what they mean for human existence.
In Jonathan Swift's "Gulliver's Travels," Gulliver encounters a small group of immortals, the struldbrugs. "Those excellent struldbrugs," exclaims Gulliver, "who, being born exempt from that universal calamity of human nature, have their minds free and disengaged, without the weight and depression of spirits caused by the continual apprehensions of death!"
But the fate of these immortals wasn't so simple, as Swift goes on to report. They were still subject to aging and disease, so that by 80, they were "opinionative, peevish, covetous, morose, vain, talkative," as well as "incapable of friendship, and dead to all natural affection, which never descended below their grandchildren." At 90, they lost their teeth and hair and couldn't carry on conversations.
Sonia Arrison, author of a new book on longevity, explains how scientific advances are making radical life expansion – to age 150 and beyond – a possibility, and what it could mean for human existence.
For as long as human beings have searched for the fountain of youth, they have also feared the consequences of extended life. Today we are on the cusp of a revolution that may finally resolve that tension: Advances in medicine and biotechnology will radically increase not just our life spans but also, crucially, our health spans.
The number of people living to advanced old age is already on the rise.
5.7 million Americans are 85 years of age and older, amounting to about 1.8% of the population, according to the Census Bureau.
By 2050, based on current trends, another 19 million Americans, or 4.34% of the population, will be added to the 85-and-older group.
The percentage of Americans 100 and older is projected to rise from 0.03% today to 0.14% of the population in 2050. That's a total of 601,000 centenarians.
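As a rough sanity check on these figures, the implied total U.S. population can be back-calculated from the quoted counts and percentages. A quick sketch (my own arithmetic; the totals below are derived from the article's numbers, not taken from the Census Bureau directly):

```python
# Back-calculate the implied total U.S. population from the quoted figures.

# Today: 5.7 million Americans aged 85+, said to be about 1.8% of the population.
implied_pop_today = 5.7e6 / 0.018   # roughly 317 million

# 2050: 601,000 centenarians projected at 0.14% of the population.
implied_pop_2050 = 601_000 / 0.0014  # roughly 429 million

print(f"Implied population today:   {implied_pop_today / 1e6:.0f} million")
print(f"Implied population in 2050: {implied_pop_2050 / 1e6:.0f} million")
```

Both implied totals are plausible (the Census Bureau's long-range projections for 2050 sit in the 400-million range), so the quoted percentages and counts are internally consistent.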
But many scientists think that this is just the beginning; they are working furiously to make it possible for human beings to achieve Methuselah-like life spans. They are studying the aging process itself and experimenting with ways to slow it down by way of diet, drugs and genetic therapy. They are also working on new ways to replace worn-out organs—and even to help the body to rebuild itself. The gerontologist and scientific provocateur Aubrey de Grey claims that the first humans to live for 1,000 years may already have been born.
The idea of "conquering" aging has raised hopes, but it has also spurred a debate about whether people should actually aspire to live that long. What does a longer-living population mean for relationships and families? How can we afford to support massive numbers of aging citizens, and how can individuals afford to support themselves? Won't a society of centenarians just be miserable, tired and cranky?
A 2009 study found that restricting calories seems to slow aging in rhesus monkeys over a 20-year period. Both of the monkeys above are pictured at 27 years old. The one on the left (A, B) ate a regular diet. The more robust-looking monkey on the right (C, D) was fed a restricted diet with 30% fewer calories than usual.
The scientists working on these issues respond to such concerns by stressing that their aim is not just to increase the quantity of life but its quality as well. A life span of 1,000 may be optimistic, they suggest, but an average span of 150 years seems well within reach in the near future, with most of those years being vital and productive.
GENE THERAPY TO EXTEND NORMAL LIFE SPANS
One key area of research is gene therapy. Cynthia Kenyon of the University of California, San Francisco, found that partially disabling a single gene, called daf-2, doubled the life span of tiny worms called Caenorhabditis elegans. Altering the daf-16 gene and other genes added to the effect, allowing the worms to survive in a healthy state six times longer than their normal life span. In human terms, they would be the equivalent of healthy, active 500-year-olds.
Experiments with animals are not always applicable to humans, of course, but humans do have the same sort of genetic pathways that Dr. Kenyon manipulated. Other researchers have made similar findings. A laboratory at the University of Arkansas genetically altered worms to live 10 times longer than normal. Spain's National Cancer Research Center found an altogether different way to extend the lives of mice by 45%.
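The "healthy, active 500-year-old" equivalence is simple scaling. A quick sketch of the arithmetic (my own, assuming a rough 80-year baseline for a normal human life span):

```python
# Scale the reported worm life-span multipliers to human-equivalent years.
BASELINE_HUMAN_YEARS = 80  # rough assumption for a normal human life span

experiments = {
    "daf-2 partially disabled (2x)": 2,
    "daf-2 plus daf-16 alterations (6x)": 6,
    "Arkansas lab worms (10x)": 10,
}

for label, multiplier in experiments.items():
    print(f"{label}: human-equivalent ~{BASELINE_HUMAN_YEARS * multiplier} years")
```

On this crude scaling, the sixfold extension lands near 480 years, consistent with the article's roughly-500 figure; the caveat in the text stands, since worm results rarely transfer to humans at anything like the same magnitude.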
REPAIRING AND REPLACING WORN-OUT BODY PARTS
Tissue and Organ Regeneration - The Wake Forest Institute for Regenerative Medicine, led by Anthony Atala, has successfully grown bladders in a lab and implanted them in children and teenagers suffering from a congenital birth defect. The basic structure of the bladders was built using biodegradable materials and was then populated with stem cells from the patients, so that their bodies wouldn't reject the transplant. It worked. Today the institute is working to grow more than 30 different organs and tissues, including livers, bone and hearts. With heart disease the No. 1 killer in the U.S., building a human heart will be a major step forward. Doris Taylor announced in 2008 that her cardiovascular lab at the University of Minnesota had managed to grow a rat heart using a technique similar to Dr. Atala's, except that the structure she used was from a donor rat. Dr. Taylor is currently repeating the experiment on pigs.
Tissue-On-Demand - Another promising new technology is organ printing, which is exactly what it sounds like: Cells, rather than ink, are put into a sophisticated 3-D printer and then printed onto a biodegradable material. The machine prints "pages" of cells on top of each other to make a three-dimensional shape. In December 2010, a company called Organovo announced that it had successfully printed human blood vessels—an important feature of all organs.
Extracellular Matrix (ECM) - At the McGowan Institute for Regenerative Medicine at the University of Pittsburgh, Stephen Badylak is working with "extracellular matrix" (ECM)—the material that gives structure to tissue—from pig bladders. Dr. Badylak has used ECM to grow back the tips of patients' fingers that have been accidentally snipped off, and his colleagues have used it to cure early-stage esophageal cancer by removing the cancerous cells and replacing them with ECM. Scientists don't understand why the substance promotes new tissue growth, and ECM can't yet grow back entire limbs, but results are impressive.
Assuming that the necessary technology eventually arrives, the big question is: What will life look like when we live to over 100?
[Chart: Extending Life Span in the Lab – fruit flies: 100% extension (study still ongoing). Source: Sonia Arrison]
[Chart: Centenarians in the U.S., projected through 2050, based on current trends. Source: U.S. Census Bureau]
One of the most important areas of potential change is family and relationships. With an average life expectancy of 150 years, it's possible that we might see age differences of as much as 80 or 90 years between spouses and partners. But the historical evidence suggests that such disparities in age probably won't be common.
Research by Norway's government statistics bureau shows that between 1906 and 2002, life expectancy rose from around 57 years to around 79 years in that country. But the average age difference in relationships remained at around 3.5 years (men being slightly older).
One reason for the rarity of relationships with large age gaps is that modern societies tend to look down on them. Will the number of men marrying much younger women continue to grow as people live longer and such relationships become less stigmatized?
Research done at Stanford, the University of California, Santa Barbara, and the University of Wisconsin suggests that older men seek younger partners primarily to continue having children. If that is the case, such men won't need to find younger partners once it is easier for older women to have their own biological children using new fertility technologies.
And in the future, older women (and men) will likely look less "aged" because they will remain healthy for much longer. Remarriage for beauty or youth will lose some of its distinguishing force.
More time to live also raises the possibility of more divorces and remarriages—the seven-year itch turned into the 70-year itch. Today, some people get married two or even three times, but as people live longer, these numbers could increase, perhaps exceeding Liz Taylor proportions for at least a small slice of the population. But greater longevity might also lead to a higher incidence of serial monogamy, regardless of whether it leads to marriage, perhaps interspersed with periods of living alone.
As researchers further refine reproductive technology like egg freezing and ovary transplants, the ranks of older parents, currently on the rise, are bound to increase even more. This raises the prospect of families in which siblings are born many decades apart, perhaps 50 years or more. How would such age gaps between children change family dynamics?
We know that siblings of the same age cohort have more meaningful and longer-lasting relationships than those separated by more years, but it is difficult to predict how the relationship between siblings born decades apart would function. It probably would be akin to that of a child and an aunt or uncle, or even a child and a grandparent.
Living longer would also mean both making and spending money longer. What would an economy look like in which work lives extended into a second century of potential productivity?
Most of us already don't expect to retire at 65. The Social Security system cannot afford it even now, and in the future, going out to pasture at 65 will mean decades of boredom. People who live to 150 will use their additional years for second and third careers, and we are likely to see a greater movement toward part-time and flex-time work.
It has long been clear that wealth creates health. We now know that health also begets wealth. In a paper titled "The Health and Wealth of Nations," Harvard economist David Bloom and Queen's University economist David Canning explain that, based on the available research, if there are "two countries that are identical in all respects, except that one has a five-year advantage in life expectancy," then the "real income per capita in the healthier country will grow 0.3–0.5% per year faster than in its less healthy counterpart."
Although these percentages might look small, they are actually quite significant, especially when we consider that between 1965 and 1990 countries experienced an average per capita income growth of 2% per year.
Those numbers are based on only a five-year longevity advantage. What if a country had a 10-, 20-, or 30-year advantage? The growth might not continue to rise in linear fashion, but if the general rule holds—a jump in life expectancy causes an increase in economic growth per capita—then having a longer-lived population would generate enormous differences in economic prosperity.
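The force of that seemingly small differential comes from compounding. A quick sketch (my own arithmetic, using the 0.3–0.5% range quoted above over a 50-year horizon):

```python
# Compound a small annual per-capita growth advantage over 50 years.
def compounded_advantage(rate: float, years: int) -> float:
    """Return the cumulative income advantage as a growth multiplier."""
    return (1 + rate) ** years

for rate in (0.003, 0.005):
    mult = compounded_advantage(rate, 50)
    print(f"{rate:.1%} faster growth for 50 years -> "
          f"{(mult - 1) * 100:.0f}% higher income per capita")
```

Even the low end of the range leaves the healthier country about 16% richer per capita after 50 years, and the high end about 28% richer, before considering any larger longevity advantage.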
In a 2006 study, the University of Chicago economists Kevin Murphy and Robert Topel painstakingly calculated that for Americans, "gains in life expectancy over the century were worth over $1.2 million per person to the current population." They also found that "from 1970 to 2000, gains in life expectancy added about $3.2 trillion per year to national wealth."
The world's advanced societies are finally in a position to launch a true offensive against the seemingly irresistible terms imposed on our lives by disease and death. That's good news for us as individuals and for humanity as a whole. A longer span of healthy years will lead to greater wealth and prospects for happiness.
But realizing the full potential of the longevity revolution will not be easy. We will need to tackle important and legitimate questions about the effects of greater health spans on population growth, resource availability and the environment. The decisions that we make in this regard will matter far more than the mere fact of greater numbers.
The very idea of radically greater longevity has its critics, on the right and the left. Leon Kass, who served as chairman of the President's Council on Bioethics under George W. Bush, sees the scientific effort to extend life as an instance of our hubris, an assault on human nature itself.
The environmental writer Bill McKibben, for his part, strongly opposes what he calls "techno-longevity," arguing that "like everything before us, we will rot our way back into the woof and warp of the planet."
I'm unconvinced. Arguments against life extension are often simply an appeal to the status quo. If humans were to live longer, we are told, the world, in some way, would not be right: It would no longer be noble, beautiful or exciting.
But what is noble, beautiful and exciting about deterioration and decline? What is morally suspect about ameliorating human suffering?
The answer is nothing. Everything that we have, socially and as individuals, is based on the richness of life. There can be no more basic obligation than to help ourselves and future generations to enjoy longer, healthier spans on the Earth that we share.
COMMENTARY: I am absolutely amazed at the potential for tissue engineering and regeneration technology to grow new organs and tissues. Those organ printers are really something else. Tissue-on-demand. Patients requiring an organ transplant would no longer have to wait in line for someone to die to obtain a new organ; many now die because there are not enough donor organs to go around. Organ rejection could also become a thing of the past, because tissue taken from the patient's failing organ can be used to regenerate a new one.
The ability to replace every aging organ in the human body is now within reach, perhaps another decade or two away. However, was man meant to live 150 years and still be happy? You cannot live through 150 years of cultural change and expect someone to keep adapting: new cultures, new music, new fashion, and so forth.
Get ready for a psychological shockwave. Just how much will people be willing to pay to live to be 150, or maybe 200, years of age? Regenerating new tissues and organs will not come cheap; only the very wealthy will be able to afford it. The Six Million Dollar Man will no longer be science fiction, but the real thing. Aging actors and actresses will stay "young" literally forever. Imagine if Marilyn Monroe or Elizabeth Taylor lived today and looked as they did in their early 20s.
I could not bear the thought of some 150-year-old male, who should've been dead decades ago, still looking 30 years old with a harem of chicks. Imagine the conversation: "Hey Bush, how's things going with you?" He would respond, "Me and Barb are taking a second honeymoon. I'm back to drinking and smoking again. Life is good." Bush would ask me, "How's things going with you, Turk?" I would respond, "Say-mo, say-mo. I'm an old fart now. Retirement home, you know."
The technology for tissue and organ regeneration is still in the laboratory. Complex tissues like the heart, brain and spinal column will require more time to develop. The idea of regenerating tissues on a printer, layer upon layer, is quite incredible. It is quite possible that tissue and organ regeneration could be commonplace in another decade or two. Just think of the potential and the societal consequences.
We are telling some, "You will age and die", and saying to those that can afford these new technologies to extend life, "You get to live to be 100. Now write me a check."
Maybe Replicants are the answer, as in the film, "Blade Runner".
When I say don't put your iPhone next to your head, it's not my opinion. It's actually stated in the iPhone manual.
My partner Rob Schuham recently sent me a blog post about RF radiation, and it mentioned that Apple, in the iPhone manual, suggests that you keep the phone 5/8" away from your body. I wasn't sure I believed it, this being the web and all. So we pulled out our iPhone manual here at the Cottage, and sure enough, there it was.
Now we have all heard about the potential harmful effects of the radiation coming off our cell phone antennae, but most of us have probably ignored it or just figured it was some sort of technological paranoia. One way or another, I've always laughed it off.
But something about this warning in their manual reminded me of the work I did all of those years on Truth, and all of the research into the tobacco companies that came with it.
Interestingly enough, the first to begin to suspect that cigarettes were killing people were the tobacco companies themselves. After all, they are most intimate with their product. And their reaction, to avoid liability, was to deny that there was a problem. When there was no way to deny the problem any longer, they shifted to the stance that, "it is a personal decision." Thus moving liability to the user. Well, those tactics were very successful for a very long time, and in some ways you could say that they are still working.
As I gazed at the warning within the iPhone brochure, I couldn't help but imagine the swirling emails and conversations among the legal team as they crafted this language. As any of you who have worked to get a manual out the door in time know, this language has been considered very carefully and has gone through many revisions. And the only reason it would make it into the final document is that the legal minds thought there was a liability issue they were mitigating. Mitigating risk is their job, and it is the job of this document. By "mitigating risk," unfortunately, I don't mean the risk to us, the users; I mean the risk of financial liability for the company.
This gets my attention. They know something, and it makes me especially concerned for the safety of my children whose brains are said to be more susceptible to environmental assaults.
Now what to do with our iPhones? Maybe there is a product opportunity here for a case that prevents you from getting the device too close to your head. There goes the slim look - 5/8" is a lot when you think about it.
You would need to hold your phone about this far away from your head!
Who does that? Has anyone heard that you should do that? It's actually quite weird if you try it.
Obviously, talking on speaker or with an earpiece makes sense as a precaution. But the radiation is not present just when you're on a call. It can be strongest when the phone is in your pocket and you're not on it. My phone just officially moved out of my pocket into my messenger bag.
Something tells me we are at the beginning of this story not the end.
Here is the copy from the Apple iPhone manual:
For optimal mobile device performance and to be sure that human exposure to RF energy does not exceed the FCC, IC, and European Union guidelines, always follow these instructions and precautions: When on a call using the built-in audio receiver in iPhone, hold iPhone with the dock connector pointed down toward your shoulder to increase separation from the antenna. When using iPhone near your body for voice calls or for wireless data transmission over a cellular network, keep iPhone at least 15 mm (5/8 inch) away from the body, and only use carrying cases, belt clips, or holders that do not have metal parts and that maintain at least 15 mm (5/8 inch) separation between iPhone and the body. iPhone is designed and manufactured to comply with the limits for exposure to RF energy set by the Federal Communications Commission (FCC) of the United States, Industry Canada (IC) of Canada, and regulating entities of Japan, the European Union, and other countries. The exposure standard employs a unit of measurement known as the specific absorption rate, or SAR. The SAR limit applicable to iPhone set by the FCC is 1.6 watts per kilogram (W/kg), 1.6 W/kg by Industry Canada, and 2.0 W/kg by the Council of the European Union. Tests for SAR are conducted using standard operating positions (i.e., at the ear and worn on the body) specified by these agencies, with iPhone transmitting at its highest certified power level in all tested frequency bands. Although SAR is determined at the highest certified power level in each frequency band, the actual SAR level of iPhone while in operation can be well below the maximum value because iPhone adjusts its cellular transmitting power based in part on proximity to the wireless network. In general, the closer you are to a cellular base station, the lower the cellular transmitting power level.
iPhone’s SAR measurement may exceed the FCC exposure guidelines for body-worn operation if positioned less than 15 mm (5/8 inch) from the body (e.g., when carrying iPhone in your pocket).
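To put the regulatory numbers quoted above in one place, here is a small sketch comparing a measured SAR value against each region's limit. The measured value below is hypothetical, made up purely for illustration; actual per-model SAR figures are published in FCC filings.

```python
# SAR limits quoted in the iPhone manual, in watts per kilogram (W/kg).
SAR_LIMITS = {
    "FCC (United States)": 1.6,
    "Industry Canada": 1.6,
    "EU Council": 2.0,
}

def compliance_margin(measured_sar: float, limit: float) -> float:
    """Headroom below the limit, as a percentage of the limit."""
    return (limit - measured_sar) / limit * 100

measured = 1.18  # hypothetical body-worn SAR measurement, W/kg
for body, limit in SAR_LIMITS.items():
    margin = compliance_margin(measured, limit)
    print(f"{body}: {margin:.0f}% below the {limit} W/kg limit")
```

The point of the manual's warning is that these margins are computed at the tested separation distance; carry the phone closer than 15 mm and the measured SAR, and hence the margin, no longer applies.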
I just looked at the User's Guide for my BlackBerry Tour, which by the way is the only real business-oriented cellphone in the marketplace (inserted for effect), and on page 3, titled "Important safety precautions," I found similar language:
When you wear the BlackBerry device close to your body, use a RIM approved holster with an integrated belt clip or maintain a distance of 0.98 in. (25 mm) between your BlackBerry device and your body while the BlackBerry device is transmitting. Use of body-worn accessories, other than RIM approved holsters with an integrated belt clip, might cause your BlackBerry device to exceed radio frequency (RF) exposure standards if the accessories are worn on your body while the BlackBerry device is transmitting. The long term effects of exceeding RF exposure standards might present a risk of serious harm. For more information about the compliance of this BlackBerry device with the FCC RF emission guidelines, visit www.fcc.gov/oet/ea/fccid and search for the FCC ID for your device as listed below: BlackBerry® Tour™ 9630 smartphone: FCC ID L6ARCF70CW
If you read my previous article about the electromagnetic waves cell phones emit, you will discover that they are inherently DANGEROUS as you can see from the following x-ray photo that shows how deep those waves penetrate into your head:
No need to worry though, Pong Research has developed a cell phone radiation guard and their site has a very alarming video of cell phone electro-magnetic radio waves before and after using the guard.