Browse Category: Computer > Software


Compact Key with Reusable Common Key for Encryption

A major aim of the field of cryptography is to design cryptosystems that are both provably secure and practical. Symmetric-key (private-key) methods have traditionally been viewed as practical: their typically smaller key sizes mean lower storage requirements and faster processing. This, however, opens the protocols up to certain vulnerabilities, such as brute-force attacks. To reduce risk, the cryptographic keys are made longer, which in turn adds overhead and makes the scheme less practical. The one-time pad (OTP) is a symmetric encryption technique that cannot be cracked, but it requires a single-use pre-shared key at least as large as the message being sent. In this technique, a plaintext is paired with a random secret key (also referred to as the OTP). Asymmetric (public-key) frameworks use key pairs consisting of a public and a private key, and these models depend heavily on the privacy of the private key. Asymmetric protocols are generally much slower than symmetric approaches in practice. The Hypertext Transfer Protocol Secure (HTTPS) protocol, the backbone of internet security, uses the Transport Layer Security (TLS) protocol stack over Transmission Control Protocol / Internet Protocol (TCP/IP) for secure and private data transfer. TLS is a protocol suite that relies on a number of subprotocols to guarantee security. Many of these subprotocols consume substantial CPU power and involve complex processing that is not optimized for big data applications. TLS uses public-key cryptography to exchange keys between the communicating parties through the TLS handshake protocol. Unfortunately, traditional cryptographic algorithms and protocols (including the schemes above, such as TLS, RSA, and AES) are not well suited to big data applications, as they must perform a significant number of computations in practice.
In turn, cloud providers face increasing CPU processing times and power usage to maintain services appropriately. In the modern computing era, with quantum architectures and increased access to network and cloud resources, the speed and integrity of such outmoded cryptographic models will be put to the test.
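The OTP pairing described above is, concretely, a bitwise XOR of the message with an equally long random key; decryption is the same XOR. A minimal sketch (the message text is illustrative):

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # The key must be truly random, at least as long as the message,
    # and never reused -- these conditions are what give the OTP its
    # unconditional (information-theoretic) security.
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # single-use pre-shared key
ciphertext = otp_encrypt(message, key)
# Decryption is the identical XOR operation with the same key.
assert otp_encrypt(ciphertext, key) == message
```

Note the practical cost stated above: the key material consumed equals the total message volume, which is exactly what makes the classical OTP unwieldy at scale.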

Extra-Compact Key with Reusable Common Key for Encryption

A major aim of the field of cryptography is to design cryptosystems that are both provably secure and practical. Symmetric-key (private-key) methods have traditionally been viewed as practical: their typically smaller key sizes mean lower storage requirements and faster processing. This, however, opens the protocols up to certain vulnerabilities, such as brute-force attacks. To reduce risk, the cryptographic keys are made longer, which in turn adds overhead and makes the scheme less practical. The one-time pad (OTP) is a symmetric encryption technique that cannot be cracked, but it requires a single-use pre-shared key at least as large as the message being sent. In this technique, a plaintext is paired with a random secret key (also referred to as the OTP). Asymmetric (public-key) frameworks use key pairs consisting of a public and a private key, and these models depend heavily on the privacy of the private key. Asymmetric protocols are generally much slower than symmetric approaches in practice. The Hypertext Transfer Protocol Secure (HTTPS) protocol, the backbone of internet security, uses the Transport Layer Security (TLS) protocol stack over Transmission Control Protocol / Internet Protocol (TCP/IP) for secure and private data transfer. TLS is a protocol suite that relies on a number of subprotocols to guarantee security. Many of these subprotocols consume substantial CPU power and involve complex processing that is not optimized for big data applications. TLS uses public-key cryptography to exchange keys between the communicating parties through the TLS handshake protocol. Unfortunately, traditional cryptographic algorithms and protocols (including the schemes above, such as TLS, RSA, and AES) are not well suited to big data applications, as they must perform a significant number of computations in practice.
In turn, cloud providers face increasing CPU processing times and power usage to maintain services appropriately. In the modern computing era, with quantum architectures and increased access to network and cloud resources, the speed and integrity of such outmoded cryptographic models will be put to the test.

Cross-Layer Device Fingerprinting System and Methods

Networks of connectivity-enabled devices, known as the internet of things (IoT), involve interrelated devices that connect and exchange data with other IoT devices and the cloud. As the number of IoT devices and their applications continues to increase significantly, managing and administering edge and access networks has become increasingly challenging. Currently, there are approximately 31 billion ‘‘things’’ connected to the internet, with a projected rise to 75 billion devices by 2025. Because of IoT interconnectivity and ubiquitous device use, assessing the risks, designing/specifying what’s reasonable, and implementing controls can overwhelm conventional frameworks. Any approach to better IoT network security, for example improved detection and denial or restriction of access by unauthorized devices, must consider its impact on performance, such as speed, power use, interoperability, and scalability. The IoT network’s physical and MAC layers are not impenetrable and face many known threats, especially identity-based attacks such as MAC spoofing. Common network infrastructure uses WPA2 or IEEE 802.11i to help protect users, their devices, and connected infrastructure. However, the risk of MAC spoofing remains: bad actors can leverage public tools on commodity 802.11 hardware, or intercept sensitive data packets at scale, to access users’ physical-layer data, which can lead to wider tampering and manipulation of hardware-level parameters.

Telehealth-Mediated Physical Rehabilitation Systems and Methods

The use of telemedicine/telehealth increased substantially during the COVID-19 pandemic, leading to its accelerated development, utilization, and acceptability. Telehealth momentum with patients, providers, and other stakeholders will likely continue, further promoting its safe and evidence-based use. Improved healthcare by telehealth has also extended to musculoskeletal care. In a recent study of telehealth physical therapy implemented in response to COVID-19, almost 95% of participants felt satisfied with the outcome they received from the telehealth physical therapy (PT) services, and over 90% expressed willingness to attend another telehealth session. While telehealth has enhanced accessibility through virtual patient visits, certain physical rehabilitation still depends largely on physical facilities and tools for evaluation and therapy. For example, limb kinematics of the shoulder joint is difficult to evaluate remotely in PT, because the structure of the shoulder allows tri-planar movement that cannot be estimated by simple single-plane joint models. With the emergence of gaming technologies, such as videogames and virtual reality (VR), come new potential tools for virtual physical rehabilitation protocols. Some research has shown that digital game environments, and associated peripherals like immersive VR (iVR) headsets, can provide a powerful medium and motivator for physical exercise. And while low-cost motion tracking systems exist to match user movement in the real world to that in the virtual environment, challenges remain in bridging traditional PT tooling and telehealth-friendly physical rehabilitation.

Software Of Predictive Scheduling For Crop-Transport Robots Acting As Harvest-Aids During Manual Harvesting

Researchers at the University of California, Davis have developed an automated harvesting system that uses predictive scheduling for crop-transport robots, reducing manual labor and increasing harvesting efficiency.

MR-Based Electrical Property Reconstruction Using Physics-Informed Neural Networks

Electrical properties (EP), such as permittivity and conductivity, dictate the interactions between electromagnetic waves and biological tissue. EP are biomarkers for pathology characterization, such as cancer. Imaging of EP helps monitor the health of the tissue and can provide important information in therapeutic procedures. Magnetic resonance (MR)-based electrical properties tomography (MR-EPT) uses MR measurements, such as the magnetic transmit field B1+, to reconstruct EP. These reconstructions rely on calculating spatial derivatives of the measured B1+. However, the numerical approximation of derivatives leads to noise amplification, introducing errors and artifacts in the reconstructions. Recently, a supervised learning-based method (DL-EPT) was introduced to reconstruct robust EP maps from noisy measurements. Still, the pattern-matching nature of this method does not allow it to generalize to new samples, since the network is trained on a limited number of simulated data pairs, making it unrealistic for clinical applications. Thus, there is a need for a robust and realistic method for EP map reconstruction.
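The noise amplification mentioned above is easy to demonstrate numerically: a finite-difference second derivative divides by the squared grid spacing, so even sub-percent measurement noise can dominate the derivative. A small sketch (the sinusoidal profile and noise level are stand-ins, not actual B1+ data):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 1001)
h = x[1] - x[0]
clean = np.sin(2 * np.pi * x)                 # stand-in for a smooth field profile
noisy = clean + rng.normal(0.0, 1e-3, x.size)  # ~0.1% measurement noise

def second_derivative(f: np.ndarray, h: float) -> np.ndarray:
    # Central-difference approximation, the kind of spatial derivative
    # an EPT-style reconstruction needs; note the 1/h**2 factor.
    return (f[2:] - 2.0 * f[1:-1] + f[:-2]) / h**2

err_signal = np.max(np.abs(noisy - clean))
err_deriv = np.max(np.abs(second_derivative(noisy, h) - second_derivative(clean, h)))
# The tiny measurement noise is blown up by roughly 1/h**2 in the derivative.
print(err_signal, err_deriv)
```

On this grid the derivative error exceeds the signal error by several orders of magnitude, which is exactly why derivative-based reconstructions are so sensitive to noise.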

Universal Patient Monitoring

Sensor-based patient monitoring is a promising approach to assessing risk, which healthcare clinics can then use to focus efforts on the highest-risk patients without having to assess risk manually. For example, pressure ulcers/injuries are localized damage to the skin and/or underlying tissue, usually occurring over a bony prominence. They most commonly develop in individuals with low mobility, such as those who are bedridden or confined to a wheelchair; they are attributed to some combination of pressure, friction, shear force, temperature, humidity, and restriction of blood flow, and are more prevalent in patients with chronic health problems. Sensor-based patient monitoring can be tuned to the individual based on relative sensor readings. However, existing sensor-based monitoring techniques, such as pressure monitoring, are one-off solutions not supported by a comprehensive system that integrates sensing, data collection, storage, data analysis, and visualization. While traditional monitoring solutions are suitable for their intended purposes, these approaches require substantial re-programming as the suite of monitoring sensors changes over time.
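One way to avoid the re-programming problem described above is a pluggable pipeline in which each sensor type registers its own scoring logic. The sketch below is purely illustrative (the class, sensor names, and thresholds are assumptions, not part of any described system):

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Reading:
    sensor: str
    value: float

class MonitoringPipeline:
    """Illustrative sensor-agnostic pipeline: adding a new sensor type
    means registering a risk-scoring function, not rewriting the
    collection/storage/analysis plumbing."""
    def __init__(self) -> None:
        self.scorers: Dict[str, Callable[[float], float]] = {}
        self.store: List[Reading] = []          # stand-in for persistent storage

    def register(self, sensor: str, scorer: Callable[[float], float]) -> None:
        self.scorers[sensor] = scorer

    def ingest(self, reading: Reading) -> float:
        self.store.append(reading)              # data collection + storage
        return self.scorers[reading.sensor](reading.value)  # analysis

pipeline = MonitoringPipeline()
# Hypothetical scorers mapping raw values to a 0..1 risk scale.
pipeline.register("pressure_mmHg", lambda v: min(v / 60.0, 1.0))
pipeline.register("skin_temp_C", lambda v: max(0.0, (v - 37.0) / 3.0))
risk = pipeline.ingest(Reading("pressure_mmHg", 45.0))
```

The design point is the registry: the suite of sensors can change over time while the ingest path stays fixed.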

Dynamically Tuning IEEE 802.11 Contention Window Using Machine Learning

The exchange of information among nodes in a communications network is based upon the transmission of discrete packets of data from a transmitter to a receiver over a carrier according to one or more of many well-known, new, or still-developing protocols. In this context, a protocol consists of a set of rules defining how the nodes interact with each other based on information sent over the communication links. Often, multiple nodes transmit a packet at the same time and a collision occurs. During a collision, the packets are disrupted and become unintelligible to the other devices listening to the carrier activity. In addition to packet loss, network performance is greatly impacted: the delay introduced by the need to retransmit the packets cascades throughout the network to the other devices waiting to transmit over the carrier. Therefore, packet collision has a multiplicative effect that is detrimental to communications networks. As a result, multiple international protocols have been developed to address packet collision, including collision detection and avoidance. Within the context of wired Ethernet networks, the issue of packet collision has been largely addressed by network protocols that try to detect a packet collision and then wait until the carrier is clear to retransmit. Emphasis is placed on collision detection, i.e., a transmitting node can determine whether a collision has occurred by sensing the carrier. At the same time, the nature of wireless networks prevents wireless nodes from detecting a collision. This is the case, in part, because in wireless networks the nodes can send and receive but cannot sense packets traversing the carrier after the transmission has started. Another problem arises when two transmitting nodes are out of range of each other but the receiving node is within range of both. In this case, a transmitting node cannot sense another transmitting node that is out of communications range.
IEEE 802.11 protocols are the basis for wireless network products using the Wi-Fi brand and are the world's most widely used wireless computer networking standards. With IEEE 802.11's packet-collision features come deficiencies, such as fairness. 802.11's resetting of its contention window to the minimum after each successful transmission may allow the node that succeeds in transmitting to dominate the channel for an arbitrarily long period of time. As a result, other nodes may suffer severe short-term unfairness. The current state of the network (e.g., load) should also be factored in. In general, there is a need for techniques to recognize network patterns and determine contention parameters that are responsive to those patterns.
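The fairness deficiency can be seen directly in the standard DCF backoff rule, sketched below (the CW bounds 15 and 1023 are the common 802.11 defaults; the rest is a simplification of the standard's behavior, not this invention's method):

```python
CW_MIN, CW_MAX = 15, 1023  # common 802.11 contention-window bounds

def next_cw(cw: int, success: bool) -> int:
    # IEEE 802.11 DCF: reset to CW_MIN after a successful transmission;
    # double (binary exponential backoff) after a collision, capped at CW_MAX.
    return CW_MIN if success else min(2 * (cw + 1) - 1, CW_MAX)

# A node that just succeeded draws its backoff slot from [0, 15], while a
# node that has collided repeatedly draws from a far larger window, so the
# recent winner tends to win the next contention round too -- the
# short-term unfairness described above.
winner_cw = next_cw(CW_MIN, True)    # 15
loser_cw = next_cw(255, False)       # 511
```

A machine-learning approach, as in this work's title, would instead tune the window from observed network state (e.g., load) rather than applying this fixed reset/doubling rule.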

Yarn-based algorithm for generating realistic cloth renderings

Researchers at UC Irvine have developed an efficient algorithm for generating computer-rendered textiles with fiber-level details and easy editability. This technology greatly enhances the richness of virtual fabric models and has the potential to impact various industries such as online retail, textile design, videogames, and animated movies.

Biological and Hybrid Neural Networks Communication

During the initial stages of development, the human brain self-assembles from a vast network of billions of neurons into a system capable of sophisticated cognitive behaviors. The human brain maintains these capabilities over a lifetime of homeostasis, and neuroscience helps us explore the brain’s capabilities. The pace of progress in neuroscience depends on the experimental toolkits available to researchers. New tools are required to explore new forms of experiments and to achieve better statistical certainty. Significant challenges remain in modern neuroscience in terms of unifying processes at the macroscopic and microscopic scales. Recently, brain organoids, three-dimensional neural tissue structures generated from human stem cells, have been used to model neural development and connectivity. Organoids are more realistic than two-dimensional cultures, recapitulating the inherently three-dimensional brain. While progress has been made studying large-scale brain patterns and behaviors, as well as understanding the brain at a cellular level, it is still unclear how smaller neural interactions (e.g., on the order of 10,000 cells) create meaningful cognition. Furthermore, systems for interrogation, observation, and data acquisition for such in vitro cultures, in addition to streaming data online to link with these analysis infrastructures, remain a challenge.

Software Tool for Predicting Sequences in a Genome that are Subject to Restriction or Other Surveillance Mechanisms

Many genomes encode restriction-modification systems (RMs) that act to protect the host cell from invading DNA by cutting at specific sites (frequently short 4-6 base reverse-complement palindromes). RMs also protect host DNA from being cut by modifying sites within the host DNA that could otherwise be targeted by the host’s own surveillance enzymes. It is also not unusual to find these enzymes adjacent to each other in the host genome. Traditional approaches to understanding these sites involve finding a methylase that is typically adjacent to a restriction enzyme, then extracting DNA, expressing the protein, and testing DNA sequences for evidence of cutting. In certain laboratory research (e.g., programs that involve transforming DNA/RNA), it may be desirable to understand more comprehensively which sequences are being surveilled by the host. Moreover, it may be desirable to know or predict which surveillance enzymes are present in a genome in order to improve cell transformation efficiency through evasion of those sequences.
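The 4-6 base reverse-complement palindromes mentioned above are straightforward to enumerate computationally. A minimal sketch (it assumes an uppercase ACGT sequence; the example uses the well-known EcoRI recognition site GAATTC):

```python
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def revcomp(seq: str) -> str:
    # Reverse complement of an uppercase ACGT sequence.
    return seq.translate(COMPLEMENT)[::-1]

def palindromic_sites(genome: str, min_len: int = 4, max_len: int = 6):
    """Yield (position, site) for every reverse-complement palindrome of
    length 4-6, the short motifs typical of restriction-enzyme targets."""
    for k in range(min_len, max_len + 1, 2):   # such palindromes have even length
        for i in range(len(genome) - k + 1):
            site = genome[i:i + k]
            if site == revcomp(site):
                yield i, site

hits = list(palindromic_sites("AAGAATTCAA"))   # contains the EcoRI site GAATTC
```

Scanning like this only finds candidate sites; the claimed software would still need to predict which enzymes are actually present and active in a given genome.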

Method To Inverse Design Mechanical Behaviors Using Artificial Intelligence

Metamaterials are constructed from regular patterns of simpler constituents known as unit cells. These engineered metamaterials can exhibit exotic mechanical properties not found in naturally occurring materials, and accordingly they have the potential for use in a variety of applications from running shoe soles to automobile crumple zones to airplane wings. Practical design using metamaterials requires the specification of the desired mechanical properties based on understanding the precise unit cell structure and repeating pattern. Traditional design approaches, however, are often unable to take advantage of the full range of possible stress-strain relationships, as they are hampered by significant nonlinear behavior, process-dependent manufacturing errors, and the interplay between multiple competing design objectives. To solve these problems, researchers at UC Berkeley have developed a machine learning algorithm in which designers input a desired stress-strain curve that encodes the mechanical properties of a material. Within seconds, the algorithm outputs the digital design of a metamaterial that, once printed, fully encapsulates the desired properties from the inputted stress-strain curve. This algorithm produces results with a fidelity to the desired curve in excess of 90%, and can reproduce a variety of complex phenomena completely inaccessible to existing methods.

Methods and Systems for Large Group Chat Conversations

In today’s modern computing environment, the growth of internet speeds and the proliferation of web-friendly devices have enabled a new generation of telecommunication technology and practice. Electronic chat (messaging) applications have become a common tool for both synchronous and asynchronous communication because of their ease of use and flexibility. Electronic group chat has also become a common tool to facilitate group discussion, including teaching, mentoring, and decision-making. Group chat is a feature in many popular business and social apps that support audio/video web-conferencing, including Zoom, Google, Microsoft, and Facebook. Typical web-conference software may include a window containing sub-windows for video, a presentation, and/or group chat. However, group chat today is limited in its ability to engage all users in a discussion, especially as the group size grows. In a large group chat with all users engaged, the resulting firehose of messages makes it impossible to have a coherent conversation. For conveners and participants alike, the results range from mild distraction to unstructured noise, leading people to disengage from the conversation and/or miss important messages, which limits the usefulness of any platform’s group chat feature.

Inertial Odometry System and Methods

Although GPS can be used for localization outdoors, indoor environments (office buildings, shopping malls, transit hubs) can be particularly challenging for much of the general population, and especially for blind walkers. GPS-denied environments have received considerable attention in recent years as our population’s digital expectations grow. To address them, various services have been explored, including technology based on Bluetooth low energy (BLE), Wi-Fi, and cameras. These approaches share common drawbacks, including calibration (fingerprinting) overhead with Wi-Fi, beacon infrastructure costs with BLE, and unoccluded-visibility requirements in camera-based systems. While localization and wayfinding using inertial sensing overcomes these challenges, it is prone to large errors from accumulated drift. Moreover, decoupling the orientation of the phone from the direction of walking, as well as accurately estimating a walker’s velocity, detecting steps, and measuring stride lengths, have also been challenges for traditional pedestrian dead reckoning (PDR) systems. Relatedly, blind walkers (especially those who do not use a dog guide) often tend to veer when attempting to walk in a straight line, and this unwanted veering may generate false turn detections with such inertial methods.
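The accumulated-drift problem is intrinsic to the basic PDR update, which advances one stride along the current heading per detected step. A toy sketch (stride length and the heading-bias values are illustrative, not from any described system):

```python
import math

def dead_reckon(steps, start=(0.0, 0.0)):
    """Basic PDR position update: one stride along the current heading
    (radians) per detected step."""
    x, y = start
    for stride, heading in steps:
        x += stride * math.cos(heading)
        y += stride * math.sin(heading)
    return x, y

# 100 steps due "east" versus the same walk with a small, growing heading
# error (e.g., gyro bias): the position error compounds with every step.
true_pos = dead_reckon([(0.7, 0.0)] * 100)
biased_pos = dead_reckon([(0.7, 0.02 * i) for i in range(100)])
drift = math.dist(true_pos, biased_pos)
```

Because every step's error is added to all later positions, even a tiny per-step heading error yields a large final displacement, which is why uncorrected inertial methods degrade so quickly indoors.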

Robust Single Cell Classification Methods and System

High-throughput next-generation sequencing (NGS) systems have allowed large-scale collection of transcriptomic data at single-cell resolution. Within this data lies variability that allows researchers to characterize and/or infer morphological and functional aspects of interest, such as single-cell type, cell state, cell growth trajectories, and inter-cellular gene regulatory networks. All of these qualities are important for understanding how cells interact with one another, both for building better cellular models in vitro and for understanding biological processes in vivo. While the size of single-cell datasets has increased massively, analysis techniques for key steps have not kept pace, relying on slow, manual pipelines run by domain experts for initial clustering. Attempts to improve classification performance have fallen short as the number of cell types and subtypes has increased (often with imbalanced class sizes) while the number of samples per label has become small. The technical variability between NGS experiments can make robust classification across multiple tissue samples difficult. Moreover, the high-dimensional nature of NGS transcriptomic data makes this type of analysis statistically and computationally intractable.

Systems For Pulse-Mode Interrogation Of Wireless Backscatter Communication Nodes

Measurement of electrical activity in nervous tissue has many applications in medicine, but the implantation of a large number of sensors is traditionally very risky and costly. Devices must be large due to their necessary complexity and power requirements, driving up the risk further and discouraging adoption. To address these problems, researchers at UC Berkeley have developed devices and methods to allow small, very simple and power-efficient sensors to transmit information by backscatter feedback. That is, a much more complex and powerful external interrogator sends an electromagnetic or ultrasound signal, which is modulated by the sensor nodes and reflected back to the interrogator. Machine learning algorithms are then able to map the reflected signals to nervous activity. The asymmetric nature of this process allows most of the complexity to be offloaded to the external interrogator, which is not subject to the same constraints as implanted devices. This allows for larger networks of nodes which can generate higher resolution data at lower risks and costs than existing devices.
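The backscatter principle described above can be illustrated with a toy on-off-keying model: the node never generates its own carrier, it only switches its reflection coefficient, and the interrogator recovers the bits from the reflected energy (all values below are illustrative assumptions, not the actual device's modulation scheme):

```python
import math

SAMPLES_PER_BIT = 8

def backscatter(bits):
    """Toy model: the node modulates the interrogator's carrier by
    switching its reflection coefficient per bit (on-off keying)."""
    carrier = lambda t: math.sin(2 * math.pi * t / 4)   # interrogator's wave
    reflected = []
    for i, bit in enumerate(bits):
        gamma = 0.8 if bit else 0.1   # reflection coefficient chosen by the node
        for s in range(SAMPLES_PER_BIT):
            reflected.append(gamma * carrier(i * SAMPLES_PER_BIT + s))
    return reflected

def demodulate(signal):
    # Interrogator side: compare per-bit reflected energy to a threshold.
    bits = []
    for i in range(0, len(signal), SAMPLES_PER_BIT):
        energy = sum(x * x for x in signal[i:i + SAMPLES_PER_BIT])
        bits.append(1 if energy > 1.0 else 0)
    return bits

data = [1, 0, 1, 1, 0]
assert demodulate(backscatter(data)) == data
```

The asymmetry is visible in the code: the node's side is a single multiplication per sample, while all signal generation and decoding live on the interrogator's side.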

Spellcasters: Physical Therapy Re-Imagined

Almost 800,000 people suffer a stroke each year in the U.S. and approximately two-thirds survive and require rehabilitation. Stroke is a leading cause of serious long-term disability. Between 2017 and 2018, stroke-related costs in the U.S. were about $53B. This total includes the cost of health care services, medicines to treat stroke, and missed days of work. According to the U.S. National Institute of Neurological Disorders and Stroke, research shows the most important element in any neurorehabilitation program is carefully directed, well-focused, repetitive practice, which is the same kind of practice used by all people when they learn a new skill, such as playing the piano or pitching a baseball. The emergence of gaming technologies, such as videogames and virtual reality (VR), opens the door to a variety of possibilities for neurorehabilitation activities.

Techniques for Encryption based on Perfect Secrecy for Bounded Storage

A major aim of the field of cryptography is to design cryptosystems that are provably secure and practical. Factors such as integrity, confidentiality, and authentication are important. Symmetric-key methods have traditionally been viewed as practical: their typically smaller key sizes mean lower storage requirements and faster processing. This, however, opens the protocols up to certain vulnerabilities, such as brute-force attacks. To reduce risk, the cryptographic keys are made longer, which in turn adds overhead and makes the scheme less practical. Asymmetric frameworks use key pairs consisting of a public and a private key, and these models depend heavily on the privacy of the private key. Asymmetric protocols are generally much slower than symmetric approaches. Symmetric-asymmetric hybrid models attempt to blend the speed and convenience of public asymmetric encryption schemes with the effectiveness of private symmetric encryption schemes. Examples of hybrids include GNU Privacy Guard, Advanced Encryption Standard-RSA, and Elliptic Curve Cryptography-RSA. In the modern computing era, with quantum architectures and access to network and cloud resources on the rise, the integrity and confidentiality of such modern cryptographic models will increasingly be under pressure.
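The hybrid division of labor can be sketched in a few lines: bulk data is encrypted with a fast symmetric layer, and only the short symmetric key goes through the slow asymmetric operation. The parameters below are deliberately toy-sized and insecure (8-bit RSA primes, a SHA-256 counter keystream standing in for a real cipher), purely to show the structure:

```python
import hashlib
import secrets

# Toy RSA key pair -- insecure, for illustration only.
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Symmetric layer: SHA-256 in counter mode as a keystream (a sketch,
    # not a vetted cipher).
    out = bytearray()
    for i, b in enumerate(data):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.append(b ^ block[0])
    return bytes(out)

# Hybrid flow: fast symmetric encryption of the bulk message, slow
# asymmetric protection of only the 16-byte session key.
session_key = secrets.token_bytes(16)
ciphertext = xor_stream(b"large message body ...", session_key)
wrapped = [pow(b, e, n) for b in session_key]      # RSA-encrypt each key byte
recovered = bytes(pow(c, d, n) for c in wrapped)   # receiver unwraps the key
assert recovered == session_key
assert xor_stream(ciphertext, recovered) == b"large message body ..."
```

Real hybrids (e.g., AES-RSA as named above) follow the same shape with proper key sizes, padding, and authenticated symmetric ciphers.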

Livesynthesis: Towards An Interactive Synthesis Flow

In digital circuit design, synthesis is a tedious and time-consuming task: designers wait several hours for relatively small design changes to yield synthesis results.

Design Workflow Improvements Using Structural Matching for Fast Re-Synthesis of Electronic Circuits

Electronic circuits are growing in complexity every year, yet existing workflows that optimize the design and placement of circuit components remain laborious and time-consuming. Incremental design changes that target device optimization can take many hours to render. Streamlined design workflows that are both fast and able to optimize performance are needed to keep pace with these device improvements. A UC Santa Cruz researcher has developed a new technique, SMatch, to shorten design workflow times with minimal quality-of-results (QoR) impact.

Variable Exposure Portable Perfusion Monitor

Brief description not available
