Available Technologies

Find technologies available for licensing from UC Santa Cruz.


Interference Management for Concurrent Transmission in Downlink Wireless Communications

It is well known that the communication capacity of wireless networks is limited by interference. Depending on the strength of the interference, there are three conventional approaches to this problem. If the interference is very strong, the receiver can decode the interfering signal and subtract it from the received signal using successive interference cancellation. If the interference is very weak compared to the desired signal, it can be treated as noise. The third and most common possibility is that the interference is comparable in strength to the desired signal. In this case the interference can be avoided by orthogonalizing it with the desired signal, using techniques such as time division multiple access (TDMA) or frequency division multiple access (FDMA).

In addition to interference, wireless networks also experience channel fading, which conventional approaches attempt to combat. Depending on the coherence time of the fading, various approaches have been used. For example, fast fading may be mitigated by diversity techniques, interleaving, and error-correcting codes. Certain diversity techniques, such as the use of multiple antennas, have been shown to combat fading while also increasing multiplexing gain and system capacity.

Multiuser diversity is a technique that increases the capacity of wireless networks using multiple antennas at the base station. In this approach the base station selects the mobile device with the best channel condition, maximizing the signal-to-noise ratio (SNR). In some implementations, K random beams are constructed and information is transmitted to the users with the highest signal-to-interference-plus-noise ratio (SINR). Searching for the best SINR in the network, however, requires feedback from the mobile devices that scales linearly with the number of users. These implementations also rely on beamforming, which is complex to implement, and the required cooperation among devices is substantial.
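The random-beam user-selection scheme described above can be sketched in a few lines. This is an illustrative toy model only (i.i.d. Rayleigh channels, equal per-beam power, invented parameter values), not the patented method:

```python
import numpy as np

rng = np.random.default_rng(0)

def opportunistic_beam_selection(n_users=8, n_antennas=4, n_beams=4, noise_power=1.0):
    """For each random beam, pick the user reporting the highest SINR.

    Toy model: i.i.d. Rayleigh channels, unit transmit power split across beams.
    """
    # Random orthonormal beams: columns of a random unitary matrix.
    q, _ = np.linalg.qr(rng.standard_normal((n_antennas, n_antennas))
                        + 1j * rng.standard_normal((n_antennas, n_antennas)))
    beams = q[:, :n_beams]

    # i.i.d. Rayleigh channel vector for each user.
    h = (rng.standard_normal((n_users, n_antennas))
         + 1j * rng.standard_normal((n_users, n_antennas))) / np.sqrt(2)

    power = np.abs(h @ beams) ** 2                 # |h_k^H w_m|^2 per user/beam
    total = power.sum(axis=1, keepdims=True)
    sinr = power / (noise_power + total - power)   # desired over noise + other beams

    # Each beam serves the user with the best SINR on it.
    return {m: int(np.argmax(sinr[:, m])) for m in range(n_beams)}

print(opportunistic_beam_selection())
```

Note that computing `sinr` requires every user to feed back one value per beam, which is exactly the linear-in-users feedback burden the passage criticizes.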

Simplified Workflow For Hybridoma Antibody Sequencing

Before recombinant antibody expression plasmids can be designed, the antibody light and heavy chain variable regions must be sequenced. Several methods of sequencing antibody variable regions are available. Some involve high-throughput RNA sequencing, but these techniques are unavailable to many labs: they require preparation of RNA-seq libraries and computational analysis, so the cost is substantial, and with sequencing cores being oversubscribed, turnaround can be as long as weeks to months. Other methods involve PCR and Sanger sequencing. However, PCR amplification of variable regions is hampered by the difficulty of generating universal primers that can amplify any given variable region, particularly given the inherently low sequence identity in the 5' leader sequences of antibody light and heavy chains upstream of the variable regions. Degenerate primers can sometimes be used, but the amplification success rate is only 80-90% due to non-specific priming and/or failure to prime at all. In addition, there is a significant risk that the variable regions of the parental myeloma line will amplify with the degenerate primers. 5' RACE (rapid amplification of 5' cDNA ends) can also be used, but mRNA degradation, along with the cDNA purification and poly-A addition required between reverse transcription and PCR, makes the technique long and difficult to perform. Non-degenerate primers can be used, but each variable region then requires multiple amplification attempts with different primer sets as well as sequence validation using mass spectrometry. With both of these primer-based methods, primer-derived mutations can be introduced. Finally, mass spectrometry can be used to determine antibody variable regions directly, but it can yield ambiguous sequences because of isobaric residues such as isoleucine and leucine; it is also time consuming, requires large amounts of purified monoclonal antibody, is expensive, and is inaccessible to most researchers.
This technology involves template-switch reverse transcription of hybridoma RNA with at least three chain-specific RT primers: one for the kappa chain, one for the lambda chain, and at least one for the heavy chain (for efficiency, this can be limited to IgG in a first pass). The resulting cDNAs are amplified in three separate PCR reactions and sequenced using Sanger sequencing.

Ligament-Based Elastic Hybrid Soft-Rigid Joints

The combination of elasticity and rigidity found within mammalian limbs enables dexterous, agile, and versatile behavior, yet most modern robots are either primarily soft or primarily rigid. Most mammals have ligaments that connect bone to bone, enabling joints to passively redirect forces and softly constrain the range of motion. Hybrid robots, composed of both soft and rigid parts, offer compliance to external forces while maintaining the strength and stability provided by rigid robots.

Natural manipulators, such as the human arm, have been shaped by the long-term optimization of evolution. They tend to be extremely versatile, having the dexterity to work with various objects and environments. Their hybrid composition of rigid and soft components, including bones, muscles, and connective tissues, yields inherent compliance and flexibility. Biological joints often have passive stability and elasticity that create mechanical feedback, benefiting disturbance responses. In addition, recent progress in the mechanical complexity of robots has popularized embedding intelligence within the system itself. Such robots may inherently dampen motion through elastic components or enable complicated movements emerging from simple actuation, e.g., origami robots.

In contrast, traditional robot arms tend to feature rigid components that are susceptible to large moments propagating throughout the entire robot. This means the robot's structural integrity can be compromised by a large, unpredictable disturbance. Robotic manipulators with rigid joints have strictly defined degrees of freedom resulting from the mechanical design. These joints typically fit within three categories: prismatic joints (linear movement along an axis), revolute joints (rotational movement around an axis), or a combination of the two. Rigid robotic joints are often actuated by motors that change their position directly, allowing straightforward kinematic models to calculate the joint's position.
Due to their dynamics, traditional feedback control systems, such as a proportional-integral-derivative (PID) controller, can solve this control problem relatively well, with modifications that can adapt to the influence of gravity. One modification to account for nonlinearities involves feed-forward neural networks with PID input features.

While these dynamics are effective within controlled environments, rigid robots pose dangers to both themselves and humans because of their intrinsic inability to deal with external forces. In environments where humans directly interface with robots, such as industrial manufacturing or telepresence, the robot's lack of compliance can put workers and civilians at risk of injury. Measures have been taken to increase the safety of these robots, but they are not innately safe. Factors including intrinsic safety, human detection, and control techniques influence the overall safety. The risk that a robot will cause physical harm has also been shown to moderate people's willingness to work with the robot. Strategies such as safety fences and human detection increase safety but limit human-robot collaboration.

Flexible robots can mitigate these external forces through structural compliance while maintaining morphological similarities with rigid robots. Systems such as soft robots and tensegrity-based robots with elastic components are inherently compliant. Biologically inspired approaches tend to exemplify this behavior. The motion of legged tensegrity structures has been validated by biological simulations while simplifying the underlying bone-ligament architecture. From a biomimicry perspective, a human finger has been functionally recreated through one-shot three-dimensional (3D) printing techniques employing both rigid and elastic components. Soft robots can provide safe human interaction, resulting in safer environments. Soft cable-driven exo-suits can be compliant while avoiding obstruction of the user's range of motion.
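As a rough illustration of the PID control loop mentioned above, the following sketch drives a toy first-order joint model to a setpoint. The gains, dynamics, and names are invented for illustration and do not model any real joint:

```python
class PID:
    """Minimal discrete PID controller (illustrative; gains and model are invented)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt          # accumulate error over time
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Drive a toy first-order joint (angle' = u - angle) toward a 1.0 rad setpoint.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
angle = 0.0
for _ in range(5000):
    u = pid.update(1.0, angle)
    angle += (u - angle) * pid.dt
print(f"final angle: {angle:.3f}")
```

The integral term is what removes the steady-state offset here; a purely proportional controller would settle short of the setpoint against this plant.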
Intelligent design approaches have even resulted in programmable tensegrities. However, due to the nonlinearities within the elastic components, these compliant robots tend to require complex models in order to be controlled properly. Soft robots made from elastic components increase compliance while often sacrificing stability and precision.

An accurate model of the system would enable the use of modern control techniques, which can provide optimal solutions. Optimal control finds the control values that optimize an objective function based on the system model. A linear-quadratic regulator (LQR) minimizes a quadratic cost function (with matrices encoding the weights of errors, energy use, etc.) over a specified time horizon; however, it is computationally expensive and demands accurate models. Model predictive control optimizes over a finite time-horizon window that is repeatedly re-solved at each new time step, reducing computational cost while enabling anticipation of future events. To create the model, a common method is system identification, which estimates the dynamics from measurements. However, noise in the design process can breed inconsistencies in production, and the nonlinear nature of flexible robots further complicates modeling. This emphasizes the need for control methods that can learn from data. One potential solution for controlling this variation between robots involves precisely adjusted models for each physical instance, but such approaches are cumbersome because constructing precise models is demanding. Thus, there is a need for a system and method of controlling soft-rigid hybrid robotic joints that overcome the deficiencies of conventional control methodologies.
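The LQR idea described above can also be illustrated with a minimal sketch: iterating the discrete Riccati equation to a fixed point yields the state-feedback gain. The double-integrator model and weight matrices below are illustrative assumptions, not a model of the actual joint:

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Discrete-time LQR gain via fixed-point iteration of the Riccati equation."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])   # toy double integrator: position, velocity
B = np.array([[0.0], [dt]])
Q = np.diag([1.0, 0.1])                  # penalize position error most
R = np.array([[0.01]])                   # cheap control effort

K = dlqr(A, B, Q, R)
# Close the loop: x_{t+1} = (A - B K) x_t drives the state toward zero.
x = np.array([[1.0], [0.0]])
for _ in range(100):
    x = (A - B @ K) @ x
print(f"state norm after 100 steps: {np.linalg.norm(x):.2e}")
```

Making the control penalty `R` larger trades tracking speed for actuator effort, which is the "weights of errors, energy use, etc." trade-off the passage refers to.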

"Incubator-Free" Vessels For Cell Culture Which Do Not Use An Air Intermediate For Gas Regulation

Cell culture plates are an essential tool for cell biology research. They are used to grow cells in a controlled environment, which allows study of the effects of different conditions on cell growth and development. The plates are typically made of plastic or glass and may have one or more wells, each of which can hold a small amount of cell culture media. The media provides the cells with the nutrients they need to grow and divide. Cell culture plates may be used in incubators, as well as in glove boxes, to grow cells in a controlled environment. The incubator provides the cells with the necessary conditions for growth, including a constant temperature, humidity, and atmosphere.

Conventional cell culture plates are susceptible to evaporation, which increases the osmolarity of the cell culture media. This in turn causes unnatural growth of cells and well-to-well variability due to uneven evaporation. Evaporation also concentrates the salts involved in the electrical signaling of electrically active cell types, changing the ionic gradients across the cell membrane and affecting the initiation, transmission (and computation) of signals in electrically active cells such as cardiac or neuronal cells. It is also difficult to maintain a desired dissolved gas concentration with standard cell culture plates. Doing so generally requires a compressed gas system, which uses gas regulators, sensors that are expensive and have limited lifetimes, and feedback control, as well as a glove box for culture and/or handling.

An incubator is used to maintain the desired temperature of the cell culture plates, but it impedes access to the cultures for feeding, microscopy, etc. Furthermore, observation equipment for use inside an incubator must be designed to resist incubator conditions (e.g., body-temperature heat and humidity).
Incubators also take up significant space, and packing incubators into a laboratory is space-inefficient relative to the form factor of the cell culture plates. As the number of cell culture plates in a single incubator increases, the incubator's ability to perform its function decreases: each cell culture vessel must be accessed a minimum number of times per week, and every time the incubator is opened, it is unable to maintain its conditions for a prolonged period, e.g., over 30 minutes. Another technical problem is that cell culture devices that use an air gap for gas exchange have an increased risk of microbial contamination via that air gap. This makes it difficult to perform manual cell culture experiments over the course of months without contamination. In addition, cross-contamination is more likely if multiple different experiments are being performed in the same laboratory. Thus, there is a need for cell culture vessels and systems that overcome these problems.

Semi-Automated Insect Culturing Device

Drosophila spp., also known as fruit flies, are widely used in genetic research. Drosophila lines (e.g., flies with a particular mutation) can only be stored as live animals; they cannot be frozen and remain viable. To maintain the stocks, live flies are manually transferred from an old vial to a new vial on a regular basis (every 1-2 weeks). Some Drosophila labs maintain hundreds or even thousands of individual lines, so maintenance of these lines can be very time consuming. A UC Santa Cruz Drosophila researcher has developed a simpler and more efficient method of transferring the flies that requires significantly less hands-on work.

An earlier version of this invention has been patented, and patent prosecution continues. However, additional improvements to hands-free Drosophila maintenance systems were still needed: in particular, a device that can be fabricated by injection molding, that better facilitates labeling of stocks, and that can be more readily separated into individual components for shipping and study.

Biodiesel Made Easy

Conventional biodiesel production methods from vegetable oils come with significant drawbacks, including unwanted soap production, low yields, and/or difficulty in purification. Currently, homogeneous base catalysts have received the most attention for biodiesel production due to their availability and low price. Common base catalysts supply hydroxide in the form of sodium hydroxide (NaOH) or potassium hydroxide (KOH) for the transesterification of vegetable oils in methanol (MeOH). The role of the hydroxide ion (OH-) is to deprotonate the mildly acidic proton of MeOH, forming methoxide (MeO-) ions for the transesterification reaction. Both NaOH and KOH show excellent catalytic activity toward biodiesel production but come with the major drawback of producing water. This reduces biodiesel yield and complicates purification of the desired biodiesel from the alkali-metal fatty acid carboxylate (i.e., soap) side product. Sodium methoxide (NaOMe) and potassium methoxide (KOMe) have also been used in biodiesel production, since they are a direct source of methoxide ions. However, these reagents result in a more complex separation of biodiesel from the byproduct. Most notably, fatty acid methyl esters (FAMEs) containing double bonds are not suitable for use in diesel engines, since the alkenes react with hydroxyl radicals present during combustion. These radicals polymerize diesel fuel during combustion, resulting in premature aging of the engine. Thus, a new approach to the transesterification of vegetable oils is desired.
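For reference, the chemistry described above can be summarized by the standard textbook scheme below (R denotes a fatty-acid chain; this is general transesterification chemistry, not the new approach itself). The first line shows why hydroxide catalysts generate the water the passage warns about:

```latex
% Hydroxide generates methoxide (and the problematic water):
\begin{align*}
\mathrm{CH_3OH} + \mathrm{OH^-} &\longrightarrow \mathrm{CH_3O^-} + \mathrm{H_2O} \\
% Overall methoxide-catalyzed transesterification of a triglyceride:
\text{triglyceride} + 3\,\mathrm{CH_3OH}
  &\xrightarrow{\;\mathrm{CH_3O^-}\;} 3\,\mathrm{RCO_2CH_3} + \text{glycerol}
\end{align*}
```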

Producing Aluminum Oxide (Alumina) From Reaction Of A Gallium/Aluminum Alloy With Water

UC Santa Cruz investigators initially made a breakthrough discovery: a gallium-rich alloy of gallium and aluminum containing aluminum nanoparticles, formed at relatively low temperatures (between 20 and 40 degrees C), can liberate nearly theoretical quantities of hydrogen from effectively any source of water (NCD 32779) through a chemical reaction requiring no outside electrical input and producing no corrosive byproducts. One of the useful byproducts of this reaction is alumina (aluminum oxide, Al2O3), a commodity chemical with a wide variety of industrial uses. This technology describes ways of further refining aluminum oxide from the products of this reaction.
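For context, the underlying aluminum-water chemistry is conventionally written as follows (whether the oxide or the hydroxide forms depends on reaction conditions; in the alloy, gallium enables the reaction and is not consumed):

```latex
\begin{align*}
2\,\mathrm{Al} + 3\,\mathrm{H_2O} &\longrightarrow \mathrm{Al_2O_3} + 3\,\mathrm{H_2} \\
2\,\mathrm{Al} + 6\,\mathrm{H_2O} &\longrightarrow 2\,\mathrm{Al(OH)_3} + 3\,\mathrm{H_2}
\end{align*}
```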

Design Of Functional Protein Materials Based on Beta-Rippled Sheet Architectures

The rippled sheet was proposed by Pauling and Corey as a structural class in 1953. After approximately half a century of only minimal activity in the field, an experimental foundation began to emerge, with some of the key papers published over the course of the last decade. Researchers at UC Santa Cruz have explored the structure of beta-rippled sheets and have discovered ways to form new ones.