Top White Papers

Driving competitive advantage by predicting the future

An organization gains a competitive advantage when it develops strategies, techniques, or resources that allow it to outperform...

Continue Reading Here

The business case for earlier software defect detection and compliance

Regardless of the industry your business operates in, software is likely all around it. Software powers our cars, airplanes, and even the medical devices we rely on to diagnose and treat illness...

Continue Reading Here

Can connected cars be secure cars?

There is growing awareness and concern over software security in the automobile industry. Learn about the challenges facing the industry today and what can be done to boost software security.

Continue Reading Here

Open source software: security risks and best practices

Understanding the risks associated with open source in general, and the security profile of specific projects, can help organizations minimize their total cost of ownership.

Continue Reading Here

White Papers

Software as a process

Today’s software products are the result of many suppliers, vendors, open source repositories, and legacy code coming together in a mix of different processes, standards, and cultures. Each input offers a chance to introduce safety, security, or performance-related errors.

This paper explains the challenges of this polyglot environment and how strategies and tools proven in a number of industries can be applied to your organization to reduce defects, meet requirements, and minimize costs.

Continue Reading Here

Coding to standards and quality: supply-chain application development

The monolithic codebase is dead. Modern applications are built from code contributed by a variety of sources, including employees, partners, and contractors in different geographies, with different skill levels, working on a number of platforms. Application development is a supply chain, with dependencies supported by a network of systems ranging from greenfield development to legacy integrations, and drawing on a patchwork of custom, open source, and commercial third-party code. Ensuring consistency, security, and standards in such an environment can be challenging, but it is essential for maintaining reputation, relationships, and customers.

Continue Reading Here

Time series analysis: Auto ARIMA

Time series analysis is used across many industries to extract meaningful statistics, characteristics, and insights from data. Businesses use time series analysis to improve performance or mitigate risk in applications such as finance, weather prediction, cell tower capacity planning, pattern recognition, signal processing, and engineering.
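
As a rough illustration of the idea behind auto-ARIMA (this sketch is not from the paper; it uses the statsmodels library, and the synthetic series and small search grid are assumptions), the order of an ARIMA(p, d, q) model can be chosen automatically by fitting candidate orders and keeping the one with the lowest AIC:

```python
# Minimal sketch of automatic ARIMA order selection by AIC.
# Assumptions: statsmodels is available; the series and the (p, d, q)
# search grid below are purely illustrative.
import itertools
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200))  # synthetic non-stationary series

best_aic, best_order, best_fit = np.inf, None, None
for p, d, q in itertools.product(range(3), range(2), range(3)):
    try:
        fit = ARIMA(y, order=(p, d, q)).fit()
    except Exception:
        continue  # skip candidate orders that fail to converge
    if fit.aic < best_aic:
        best_aic, best_order, best_fit = fit.aic, (p, d, q), fit

print("selected order:", best_order)
print("12-step forecast:", best_fit.forecast(steps=12))
```

Dedicated auto-ARIMA implementations use a smarter stepwise search and also handle seasonal terms; the brute-force grid above only shows the selection principle.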

Continue Reading Here

TotalView CUDA

CUDA introduces developers to a number of new concepts (such as kernels, streams, warps, and explicitly multilevel memory) that are not encountered in serial or other parallel programming paradigms. Visibility into these elements is critical for troubleshooting and tuning applications that use CUDA. This paper highlights the concepts introduced in CUDA 3.0–4.0, their impact on troubleshooting CUDA applications, and how TotalView helps users deal with these CUDA-specific constructs. CUDA is frequently used alongside MPI parallelism and host-side multicore and multithread parallelism. The TotalView parallel debugger provides developers with an integrated view of all three levels of parallelism within a single debugging session.
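
For readers new to the terminology, the following sketch (not from the paper; it uses Python's numba library as a stand-in for CUDA C, and the array sizes and launch configuration are arbitrary) shows the kernel and thread-grid concepts mentioned above:

```python
# Minimal CUDA kernel sketch via numba (illustrative only; requires a CUDA GPU).
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(out, x, y, a):
    # Each GPU thread computes one element; threads are grouped into
    # blocks, and blocks into a grid (the "launch configuration").
    i = cuda.grid(1)
    if i < out.shape[0]:
        out[i] = a * x[i] + y[i]

n = 1 << 20
x = np.ones(n, dtype=np.float32)
y = np.full(n, 2.0, dtype=np.float32)
out = np.zeros(n, dtype=np.float32)

threads_per_block = 256
blocks_per_grid = (n + threads_per_block - 1) // threads_per_block
saxpy[blocks_per_grid, threads_per_block](out, x, y, np.float32(3.0))
```

Visibility into which block and thread a problem occurs in is exactly the kind of CUDA-specific insight the paper is concerned with.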

Continue Reading Here

Transitioning to multicore: Part II

Multicore systems are ubiquitous; it’s virtually impossible to buy even commodity computers without a dual-, quad-, or hex-core processor. It won’t be long before many-core processors are prevalent as well. Each core in a multicore processor is capable of executing a program, so a quad-core processor can run four separate programs at the same time. That’s great if you have many different programs to run at once, but it can become a problem when you need performance from a single program. Those four cores can also potentially run one program faster than a single-core processor would, but only if the program is written correctly. If you run a sequential (or serial) program written for single-core architectures on a multicore platform, it will generally only be able to leverage a single core. Serial programs don’t run any faster, and may even run slightly slower, on multicore processors.
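
As a toy illustration of that last point (this sketch is not from the paper; the workload and process count are arbitrary, and Python's multiprocessing module stands in for whatever parallel framework a real application would use), the same work only spreads across cores once the program is explicitly structured for it:

```python
# Toy comparison: the serial loop uses one core; the process pool can use several.
import math
from multiprocessing import Pool

def work(n):
    # A small CPU-bound task.
    return sum(math.sqrt(i) for i in range(n))

TASKS = [2_000_000] * 8

def serial():
    return [work(n) for n in TASKS]      # runs on a single core

def parallel():
    with Pool(processes=4) as pool:      # spreads tasks across 4 cores
        return pool.map(work, TASKS)

if __name__ == "__main__":
    serial()
    parallel()
```

Timing the two versions shows the potential speedup, and also the coordination overhead that can make a naively parallelized program slower for small workloads.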

Continue Reading Here

Many Integrated Core debugging

Intel® Xeon® Phi™ coprocessors present an exciting opportunity for HPC developers to take advantage of many-core processor technology. Since the Intel Xeon Phi coprocessor shares many architectural features and much of the development toolchain with multicore Intel Xeon processors, it is relatively easy to migrate a program to the Intel Xeon Phi coprocessor. However, to fully leverage the Intel Xeon Phi coprocessor, a new level of parallelism needs to be expressed, which may require a significant rethinking of algorithms. Scientists need tools that support debugging and optimizing hybrid MPI/OpenMP parallel applications that may have dozens or even hundreds of threads per node.
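
The "hybrid MPI/OpenMP" structure mentioned above simply means two nested levels of parallelism: processes across nodes (MPI) and threads within each node. A rough sketch of that shape (not from the paper; it uses Python's mpi4py and a thread pool as stand-ins for the MPI/OpenMP code a real application would use, and the chunk sizes are arbitrary):

```python
# Hybrid-parallel shape: one MPI process (rank) per node, many threads per rank.
# Run with e.g.:  mpiexec -n 4 python hybrid_sketch.py
from concurrent.futures import ThreadPoolExecutor
from mpi4py import MPI

def work(chunk):
    return sum(chunk)  # placeholder per-thread task

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Outer level: each MPI rank takes a slice of the problem.
data = list(range(1_000_000))
my_slice = data[rank::size]

# Inner level: threads within the rank work on sub-chunks.
chunks = [my_slice[i::8] for i in range(8)]
with ThreadPoolExecutor(max_workers=8) as pool:
    local_total = sum(pool.map(work, chunks))

# Combine the per-rank results on rank 0.
total = comm.reduce(local_total, op=MPI.SUM, root=0)
if rank == 0:
    print("total:", total)
```

With dozens of ranks and hundreds of threads per node, the debugging challenge the paper describes is keeping track of state across both levels at once.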

Continue Reading Here

Current visual data analysis techniques

The term visualization was coined in the July 1987 Visualization in Scientific Computing (ViSC) report. The National Science Foundation defines it as “a method of computing that offers a way for seeing the unseen. It enriches the process of discovery and fosters profound and unexpected insights.” However, current visualization techniques, such as realistic rendering, are primarily geared toward geometric data sources such as CAD and two- and three-dimensional modeling.

Continue Reading Here

The value of visual data analysis

Decisions have to be made based on data that are voluminous, complex, and arriving at a rapid rate. The raw numbers, in records or tables, are overwhelming. That’s why many engineers and researchers are turning to visual data analysis (VDA) software, such as PV-WAVE, to better analyze their data. These tools combine state-of-the-art graphics, data access, data management, and analytical techniques into a highly interactive environment that can interpret large amounts of data quickly.

Continue Reading Here
