Showing posts with label Computer Software. Show all posts

Sunday, 15 January 2017

Liquid silicon: Multi-duty computer chips could bridge the gap between computation and storage

Computer chips in development at the University of Wisconsin-Madison could make future computers more efficient and powerful by combining tasks usually kept separate by design.

Jing Li, an assistant professor of electrical and computer engineering at UW-Madison, is creating computer chips that can be configured to perform complex calculations and store massive amounts of information within the same integrated unit -- and communicate efficiently with other chips. She calls them "liquid silicon."

"Liquid means software and silicon means hardware. It is a collaborative software/hardware technique," says Li. "You can have a supercomputer in a box if you want. We want to target a lot of very interesting and data-intensive applications, including facial or voice recognition, natural language processing, and graph analytics."

The high-speed number-crunching of processors and the bulk data storage of memory in modern computers usually fall to two entirely different types of hardware.

"There's a huge bottleneck when classical computers need to move data between memory and processor," says Li. "We're building a unified hardware that can bridge the gap between computation and storage."

Processor and memory chips are typically separately produced by different manufacturing foundries, then assembled together by system engineers on printed circuit boards to make computers and smartphones. The separation means even simple operations, like searches, require multiple steps to accomplish: first fetching data from the memory, then sending that data all the way through the deep storage hierarchy to the processor core.

The chips Li is developing, by contrast, incorporate memory, computation and communication into the same device using a layered design called monolithic 3D integration: silicon and semiconductor circuitry on the bottom connected with solid-state memory arrays on the top using dense metal-to-metal links. End users will be able to configure the devices to allocate more or fewer resources to memory or computation, depending on what types of applications a system needs to run.
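
The configurability described above can be pictured with a toy model. The sketch below is purely illustrative -- the tile counts, mode names, and allocation rule are invented for the example, not taken from Li's actual design:

```python
# Illustrative sketch only: a toy model of a reconfigurable fabric in the
# spirit of "liquid silicon". Tile counts, modes, and the allocation rule
# are invented assumptions, not details of the real hardware.

class LiquidFabric:
    """A grid of tiles, each configurable as compute or memory."""

    def __init__(self, n_tiles=64):
        self.n_tiles = n_tiles
        self.modes = ["memory"] * n_tiles  # default: all tiles store data

    def configure(self, compute_fraction):
        """Dedicate a fraction of tiles to computation, the rest to storage."""
        n_compute = int(self.n_tiles * compute_fraction)
        self.modes = ["compute"] * n_compute + ["memory"] * (self.n_tiles - n_compute)

    def profile(self):
        """Report how many tiles are in each role."""
        return {m: self.modes.count(m) for m in ("compute", "memory")}

# A search-heavy workload might want mostly memory; analytics mostly compute.
fabric = LiquidFabric(n_tiles=64)
fabric.configure(compute_fraction=0.25)
print(fabric.profile())  # {'compute': 16, 'memory': 48}
```

The point of the sketch is only the interface: the same device exposes a knob that trades storage capacity for compute capacity at configuration time.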

"It can be dynamic and flexible," says Li. "We originally worried it might be too hard to use because there are too many options. But with proper optimization, anyone can take advantage of the rich flexibility offered by our hardware."

To help people harness the new chip's potential, Li's group also is developing software that translates popular programming languages into the chip's machine code, a process called compilation.

"If I just handed you something and said, 'This is a supercomputer in a box,' you might not be able to use it if the programming interface is too difficult," says Li. "You cannot imagine people programming in terms of binary zeroes and ones. It would be too painful."

Thanks to her compilation software, programmers will be able to port their applications directly onto the new type of hardware without changing their coding habits.
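
To see what compilation means in miniature, here is a hedged toy example -- not Li's actual toolchain, whose target machine code is unpublished here: a few lines of arithmetic are parsed and translated into instructions for an invented stack machine, then executed.

```python
# Toy illustration of compilation: translate a tiny arithmetic language into
# instructions for an invented stack machine, then run them. Real compilers
# for reconfigurable hardware are far more involved, but the pipeline shape
# (parse -> code generation -> execution) is the same.

import ast

def compile_expr(source):
    """Compile an arithmetic expression into stack-machine instructions."""
    ops = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL"}

    def emit(node):
        if isinstance(node, ast.Constant):
            return [("PUSH", node.value)]
        if isinstance(node, ast.BinOp):
            return emit(node.left) + emit(node.right) + [(ops[type(node.op)], None)]
        raise ValueError("unsupported construct")

    return emit(ast.parse(source, mode="eval").body)

def run(program):
    """Interpret the compiled instructions on a simple stack machine."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"ADD": a + b, "SUB": a - b, "MUL": a * b}[op])
    return stack[0]

code = compile_expr("(2 + 3) * 4")
print(run(code))  # 20
```

The programmer writes `(2 + 3) * 4`; the machine only ever sees PUSH/ADD/MUL opcodes. Hiding that translation is exactly the convenience Li's compilation software aims to provide.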

To evaluate the performance of prototype liquid silicon chips, Li and her students built an automated testing system from scratch. The platform can reveal reliability problems better than even the most advanced industry testing, and multiple companies have sent their chips to Li for evaluation.

Given that testing accounts for more than half the consumer cost of computer chips, having such advanced infrastructure at UW-Madison can help make liquid silicon chips a reality and facilitate future research.

"We can do all types of device-level, circuit-level and system-level testing with our platform," says Li. "Our industry partners told us that our testing system does the entire job of a test engineer automatically."


Friday, 13 January 2017

Fighting computer viruses isn't just for software anymore


Researchers want to use hardware to fight computer viruses


Fighting computer viruses isn't just for software anymore. Binghamton University researchers will use a grant from the National Science Foundation to study how hardware can help protect computers too.

"The impact will potentially be felt in all computing domains, from mobile to clouds," said Dmitry Ponomarev, professor of computer science at Binghamton University, State University of New York. Ponomarev is the principal investigator of a project titled "Practical Hardware-Assisted Always-On Malware Detection."

More than 317 million pieces of new malware -- computer viruses, spyware, and other malicious programs -- were created in 2014 alone, according to work done by Internet security teams at Symantec and Verizon. Malware is growing in complexity, with crimes such as digital extortion (a hacker steals files or locks a computer and demands a ransom for the decryption keys) becoming a major avenue of cyberattack.

"This project holds the promise of significantly impacting an area of critical national need to help secure systems against the expanding threats of malware," said Ponomarev. "[It is] a new approach to improve the effectiveness of malware detection and to allow systems to be protected continuously without requiring the large resource investment needed by software monitors."

Countering threats has traditionally been left solely to software programs, but Binghamton researchers want to modify a computer's central processing unit (CPU) chip -- essentially, the machine's brain -- by adding logic to check for anomalies while running a program like Microsoft Word. If an anomaly is spotted, the hardware will alert more robust software programs to check out the problem. The hardware won't be right about suspicious activity 100 percent of the time, but since the hardware is acting as a lookout at a post that has never been monitored before, it will improve the overall effectiveness and efficiency of malware detection.

"The modified microprocessor will have the ability to detect malware as programs execute by analyzing the execution statistics over a window of execution," said Ponomarev. "Since the hardware detector is not 100-percent accurate, the alarm will trigger the execution of a heavy-weight software detector to carefully inspect suspicious programs. The software detector will make the final decision. The hardware guides the operation of the software; without the hardware the software will be too slow to work on all programs all the time."

The modified CPU will use low-complexity machine learning -- the ability to learn without being explicitly programmed -- to distinguish malware from normal programs, the primary area of expertise of co-investigator Lei Yu.
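
The two-stage idea can be sketched as follows. This is a hedged illustration, not the project's actual detector: a cheap "hardware-style" check over a window of execution statistics raises an alarm, and only flagged windows pay for a heavyweight "software" inspection. All feature names and thresholds here are invented.

```python
# Sketch of two-stage malware detection: a fast, imprecise first stage gates
# a slow, accurate second stage. Features ("indirect_branch_rate",
# "cache_miss_rate"), weights, and thresholds are invented for illustration.

def hardware_detector(window, threshold=0.7):
    """Low-complexity stage: a linear score over per-window statistics."""
    weights = {"indirect_branch_rate": 0.6, "cache_miss_rate": 0.4}
    score = sum(weights[k] * window[k] for k in weights)
    return score > threshold  # cheap and fast; allowed to be wrong sometimes

def software_detector(window):
    """Heavyweight stage: in reality a thorough inspection; here a stand-in."""
    return window["indirect_branch_rate"] > 0.9

def scan(windows):
    """Only windows flagged by the fast stage reach the slow stage."""
    return [software_detector(w) for w in windows if hardware_detector(w)]

benign = {"indirect_branch_rate": 0.1, "cache_miss_rate": 0.2}
shady = {"indirect_branch_rate": 0.95, "cache_miss_rate": 0.8}
print(scan([benign, shady]))  # [True]: only the suspicious window was inspected
```

The benign window never triggers the expensive check at all, which is the efficiency argument Ponomarev makes: the hardware guides the software so the software need not run on everything all the time.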

"The detector is, essentially, like a canary in a coal mine to warn software programs when there is a problem," said Ponomarev. "The hardware detector is fast, but is less flexible and comprehensive. The hardware detector's role is to find suspicious behavior and better direct the efforts of the software."

Much of the work -- including exploration of the trade-offs of design complexity, detection accuracy, performance and power consumption -- will be done in collaboration with former Binghamton professor Nael Abu-Ghazaleh, who moved on to the University of California-Riverside in 2014.

Lei Yu, associate professor of computer science at Binghamton University, is a co-principal investigator of the grant.

Grant funding will support the graduate students who will work on the project in both Binghamton and California, conference travel and the investigation itself. The three-year grant is for $275,000.

Thursday, 12 January 2017

New software continuously scrambles code to foil cyber attacks


Technique sets a deadline on hackers to severely limit chances of success

As long as humans are writing software, there will be coding mistakes for malicious hackers to exploit. A single bug can open the door to attackers deleting files, copying credit card numbers or carrying out political mischief.

A new program called Shuffler tries to preempt such attacks by allowing programs to continuously scramble their code as they run, effectively closing the window of opportunity for an attack. The technique is described in a study presented this month at the USENIX Symposium on Operating Systems Design and Implementation (OSDI) in Savannah, Ga.

"Shuffler makes it nearly impossible to turn a bug into a functioning attack, defending software developers from their mistakes," said the study's lead author, David Williams-King, a graduate student at Columbia Engineering. "Attackers are unable to figure out the program's layout if the code keeps changing."

Even after repeated debugging, software typically contains up to 50 errors per 1,000 lines of code, each a potential avenue for attack. Though security defenses are constantly evolving, attackers are quick to find new ways in.

In the early 2000s, computer operating systems adopted a security feature called address space layout randomization, or ASLR. This technique rearranges memory when a program launches, making it harder for hackers to find and reuse existing code to take over the machine. But hackers soon discovered they could exploit memory disclosure bugs to grab code fragments once the program was already running.

Shuffler was developed to deflect this latter style of code-reuse attack. It takes ASLR's code-scrambling approach to the extreme by randomizing small blocks of code every 20 to 50 milliseconds, imposing a severe deadline on would-be attackers. Until now, shifting around running code as a security measure was thought to be technically impractical because existing solutions require specialized hardware or software.
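
The effect of continuous re-randomization on a leaked address can be modeled with a short sketch. This is an illustration of the concept only, not Shuffler's implementation: code blocks live at addresses in a table, and periodically re-randomizing the table makes any leaked address go stale. The interval, address range, and block names are made up.

```python
# Conceptual model of continuous code re-randomization: a leaked address is
# useless once the next shuffle period arrives. Not Shuffler's real mechanism,
# which relocates actual machine code in process memory.

import random

class ShufflingLayout:
    """Maps code blocks to addresses; reshuffle() invalidates old leaks."""

    def __init__(self, blocks, seed=0):
        self.blocks = list(blocks)
        self.rng = random.Random(seed)
        self.layout = {}
        self.reshuffle()

    def reshuffle(self):
        """In Shuffler this fires every 20-50 ms; here we call it by hand.
        New addresses are drawn disjoint from the current layout, so in this
        toy model an old leak is guaranteed stale after one shuffle."""
        pool = [a for a in range(0x1000, 0x10000, 0x100)
                if a not in self.layout.values()]
        self.layout = dict(zip(self.blocks, self.rng.sample(pool, len(self.blocks))))

    def address_of(self, block):
        return self.layout[block]

layout = ShufflingLayout(["main", "memcpy", "syscall_stub"])
leaked = layout.address_of("memcpy")   # attacker leaks an address...
layout.reshuffle()                     # ...but the shuffle period elapses
print(leaked != layout.address_of("memcpy"))  # True: the leak is stale
```

A code-reuse attack needs to leak an address and then use it; the shuffle interval is a deadline between those two steps.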

(Demo: "#" characters represent code in memory as a typical web server runs. When the server shifts to running with Shuffler, the "#"s move every 50 milliseconds, and the shuffled server continues to serve its web page throughout.)

"By the time the server returns the information the attacker needs, it is already invalid -- Shuffler has already relocated the respective code snippets to different memory locations," said study coauthor Vasileios Kemerlis, a computer science professor at Brown University.

Designed to be user-friendly, Shuffler runs alongside the code it defends, without modifications to program compilers or the computer's operating system. It even randomizes itself to defend against possible bugs in its own code.

The researchers say Shuffler runs faster and requires fewer system changes than similar continuous-randomization software such as TASR and Remix, developed at MIT Lincoln Laboratory and Florida State University, respectively.

As an invitation to other researchers to try to break Shuffler, Williams-King is currently running the software on his personal website. (He can check that the code is shuffling, and whether anyone has attacked the site, by reviewing the program's logs.)

On computation-heavy workloads, Shuffler slows programs by 15 percent on average, but at larger scales -- a webserver running on 12 CPU cores, for example -- the drop in performance is negligible, the researchers say.

This versatility means that software distributors as well as security-conscious individuals could be potential end users. "It's the first system that is trying to be a serious defense that people can use, right now," said Williams-King.

Shuffler needs a few last improvements before it is made public. The researchers say they want to make it easier to use on software they haven't yet tested. They also want to improve Shuffler's ability to defend against exploits that take advantage of server crashes.

"Billions of lines of vulnerable code are out there," said the study's senior author, Junfeng Yang, a computer science professor at Columbia Engineering and member of the Data Science Institute. "Rather than finding every bug or rewriting all billions of lines of code in safer languages, Shuffler instantly lets us build a stronger defense."


Wednesday, 11 January 2017

Streamlining the Internet of Things and other cyber-physical systems

Streamlining the Internet of Things 

Sometimes referred to as the Internet of Things, cyber-physical systems range from phones to self-driving cars, from airplane controls to home energy meters. They are both touchable objects and invisible code. However, as streamlined as cyber-physical systems appear, the technology grew out of manufacturing systems that were never designed to accommodate it.

To change that, researchers banded together from Michigan Technological University, Boston University, University of California, Berkeley, and University of California, Riverside. Their latest work, a keynote paper published in IEEE Transactions on CAD, lays the groundwork for better design in cyber-physical systems.

"The register-transfer-level (RTL) design flow for digital circuits is one of the major success stories in electronic design automation," the authors write. "Will a durable design methodology, such as the RTL design flow, emerge for cyber-physical systems?"

The answer, they say, depends on how well cross-disciplinary teams learn to manage heterogeneous and dynamic technologies across large scales while accounting for human users.

Reframing Automation Design

A better cyber-physical design system comes down to the nuts and bolts of new technology -- but it's not as simple as separating out the mechanical and digital pieces. Cyber-physical systems are not just the interface of the two; together they create an emergent space with new properties and challenges.

Take a smart grid, for example. Much like a cake is more than the sum of the sugar and flour in its recipe, a smart grid is more than a home energy meter, power lines, a power plant control center and software. The pieces are networked in order to predict, adjust and assess production and consumption in close to real time.

"Sensors transmit data to the physical system, which reads the data and takes action," says Shiyan Hu, an associate professor of computer engineering at Michigan Tech and one of the study co-authors. Usually, this transfer is streamlined and efficient, but as a cybersecurity expert, Hu knows the exchange is a weak link.
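
The networked sense-predict-act loop of a smart grid can be sketched in a few lines. This is a minimal, hedged illustration -- the class names, the moving-average "prediction," and the numbers are invented, not drawn from the paper:

```python
# Minimal sense-predict-act loop for a toy smart grid: meters report
# consumption, the controller predicts near-term demand as a moving average,
# and generation is adjusted toward the prediction each cycle. A real grid
# controller would add ramp limits, losses, and market signals.

from collections import deque

class GridController:
    def __init__(self, window=3):
        self.readings = deque(maxlen=window)  # recent meter reports (kW)
        self.generation = 0.0

    def report(self, kw):
        """A home energy meter sends its latest consumption reading."""
        self.readings.append(kw)

    def step(self):
        """Predict demand as a moving average, then adjust generation."""
        predicted = sum(self.readings) / len(self.readings)
        self.generation = predicted
        return self.generation

grid = GridController()
for demand in (10.0, 12.0, 14.0):
    grid.report(demand)
print(grid.step())  # 12.0: generation tracks predicted demand
```

Each arrow in that loop -- meter to network, network to controller, controller to plant -- is a data exchange of exactly the kind Hu identifies as a potential weak link.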

Cybersecurity

"Security and privacy have become two of the foremost design concerns for cyber-physical systems today," the team writes, "and they cannot just be bolted on as an afterthought."

Hu says hiring specialized experts at each stage of the design and manufacturing process is essential. There is no one-size-fits-all kind of cybersecurity. In a self-driving car, for example, every device and piece of software -- from the central operating system to a smartphone's Bluetooth connection to the anti-lock brakes -- needs a tailored security design.

"Cybersecurity is part of this, but more -- cyber-physical security impacts not just the network, it impacts the whole system, including the physical objects," Hu explains.

The key to making safe, dependable and innovative technologies, Hu and his team explain, is to embrace big data. They encourage combining model-based design with data-based learning: in other words, merging two existing paradigms into one practice. The result could help establish the equivalent of the RTL design flow for cyber-physical systems. Streamlining also incorporates machine learning, real-time sensors, effective communication interfaces and human-centric strategies -- simply a smarter way to approach the design process.
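
The merger of model-based design with data-based learning can be illustrated with a deliberately minimal sketch. The linear physical model and the mean-residual "learning" below are invented stand-ins, not methods from the paper:

```python
# Hedged sketch of combining model-based design with data-based learning:
# start from a first-principles model's prediction, then learn a correction
# from observed residuals. Both the model and the learning rule are toy
# stand-ins chosen for clarity.

def physical_model(load_kw):
    """First-principles estimate, e.g. line losses proportional to load."""
    return 0.05 * load_kw

def fit_correction(observations):
    """Data-based stage: learn the average error of the physical model."""
    residuals = [measured - physical_model(load) for load, measured in observations]
    return sum(residuals) / len(residuals)

def hybrid_model(load_kw, correction):
    """Model-based prediction plus learned correction."""
    return physical_model(load_kw) + correction

# Invented field data: the model consistently underestimates by about 0.2 kW.
data = [(100.0, 5.2), (200.0, 10.2), (300.0, 15.2)]
corr = fit_correction(data)
print(round(hybrid_model(150.0, corr), 2))  # 7.7
```

The physical model supplies structure a pure learner would have to rediscover, while the data corrects what the model gets systematically wrong -- the complementary strengths the authors argue for merging.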