Top White Papers

A fast, scalable solution for solving the transportation problem

While the family of transportation problems can be solved by hand for relatively small instances, the IMSL Library includes an algorithm that is fast and scalable.
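
For context, the classic transportation problem can be stated as a linear program (the standard textbook formulation, not a description of the IMSL routine's interface): given m sources with supplies s_i, n destinations with demands d_j, and unit shipping costs c_{ij}, choose shipment amounts x_{ij} that solve

\min_{x \ge 0} \sum_{i=1}^{m}\sum_{j=1}^{n} c_{ij}\, x_{ij}
\quad \text{subject to} \quad
\sum_{j=1}^{n} x_{ij} \le s_i \;\; (i=1,\dots,m), \qquad
\sum_{i=1}^{m} x_{ij} \ge d_j \;\; (j=1,\dots,n).

Small instances can be worked by hand with textbook methods such as the northwest-corner rule followed by stepping-stone improvements; the appeal of a library routine is handling large m and n efficiently.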

7 questions to select, deploy, and maintain open source software effectively

A question that often comes up in enterprise software organizations is: What happens when open source software doesn’t work?

Prioritize defects faster with Klocwork SmartRank

The faster a bug is identified in code, the easier it is to fix. Klocwork SmartRank helps identify which issues to fix first.

Enterprise API management defined

This paper defines API management and the best path to quickly capitalize on new revenue channels without assuming risk along the way.

White Papers

Many Integrated Core debugging

Intel® Xeon® Phi™ coprocessors present an exciting opportunity for HPC developers to take advantage of many-core processor technology. Since the Intel Xeon Phi coprocessor shares many architectural features and much of the development tool chain with multicore Intel Xeon processors, migrating a program to it is generally straightforward. However, to fully leverage the coprocessor, a new level of parallelism must be expressed, which may require a significant rethinking of algorithms. Scientists need tools that support debugging and optimizing hybrid MPI/OpenMP parallel applications that may have dozens or even hundreds of threads per node.
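
As a minimal illustration of the hybrid programming style such tools must support, the C sketch below (written for this summary, not taken from the paper; build commands and thread counts vary by installation) combines MPI ranks with an OpenMP thread team inside each rank:

/* Hybrid MPI/OpenMP sketch: one MPI rank per node or coprocessor,
 * many OpenMP threads per rank. Build with something like:
 *   mpicc -fopenmp hybrid.c -o hybrid   (exact flags depend on the toolchain) */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, rank, nranks;

    /* Request threaded MPI: a debugger sees nranks processes,
     * each containing a potentially large team of threads. */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    double local = 0.0;

    /* On a many-core coprocessor this team may hold dozens or
     * hundreds of threads; the loop body stands in for real work. */
    #pragma omp parallel for reduction(+:local)
    for (int i = 0; i < 1000000; i++)
        local += 1.0 / (1.0 + (double)i);

    double global = 0.0;
    MPI_Reduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("%d ranks x %d threads per rank, sum = %f\n",
               nranks, omp_get_max_threads(), global);

    MPI_Finalize();
    return 0;
}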

Current visual data analysis techniques

The term visualization was coined in the July 1987 Visualization in Scientific Computing (ViSC) report. The National Science Foundation defines it as “a method of computing that offers a way for seeing the unseen. It enriches the process of discovery and fosters profound and unexpected insights.” However, current visualization techniques, such as realistic rendering, are primarily geared toward geometric data sources such as CAD and two- and three-dimensional modeling.

The value of visual data analysis

Decisions have to be made based on data that are voluminous, complex, and arriving at a rapid rate. The raw numbers, in records or tables, are overwhelming. That’s why many engineers and researchers are turning to visual data analysis (VDA) software, such as PV-WAVE, to better analyze their data. These tools combine state-of-the-art graphics, data access, data management, and analytical techniques into a highly interactive environment for interpreting large amounts of data quickly.

VDA tool technology unleashed

For many PV-WAVE programmers, exposure to VDA tools consists of running the navigator and using it as an interactive, point-and-click way of accessing PV-WAVE functionality. That is certainly the intention of the navigator, but it is only one way of making use of the underlying technology. Some users may also invoke the individual VDA tools (Wz routines) interactively, for example with the command "WzPlot, DIST(10)". This is another way of using VDA tools in an ad hoc, interactive fashion.

VDA tool technology

The VDA tool architecture is a framework for developing cross-platform applications with graphical user interfaces (GUIs) in PV-WAVE. This paper puts the VDA tool architecture into context and positions the PV-WAVE environment as a development tool.

Validating C++ translations of MATLAB models with TotalView

This paper describes one of the many ways developers have used TotalView to enable the development of cutting-edge applications. Frequently, applications in HPC are developed in stages. Problem domains such as signal processing and data analysis, applied physics modeling (geophysics, meteorology, astrophysics, computational chemistry), digital content creation, and financial analysis each pose highly specialized computational challenges that call on the skills of domain specialists with deep experience in the subject matter. The algorithms, computational kernels, and other modules written by these domain specialists are often integrated by software engineers and computational scientists who specialize in areas like modern software component architectures, parallelism, and embedded development. This division of labor sets the stage for the creation of sophisticated applications that no single developer could have written.

Leveraging the NVIDIA CUDA BLAS in the IMSL Fortran Numerical Library

In recent years, traditional high-performance hardware has been supplemented with graphics processing units once used only for 3D visualization. These general-purpose graphics processing units (GPGPUs) have matured to the point that BLAS packages are now available and both single- and double-precision calculations are supported. Together, these developments indicate the environment is mature enough for general-purpose libraries such as IMSL to consider leveraging the hardware.
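
For a sense of what offloading to the GPU BLAS looks like, here is a hand-written C sketch against the NVIDIA CUBLAS API (illustrative only; it is not the IMSL Fortran Library interface, and error checking is omitted). It computes C = A*B in double precision on the device:

/* Double-precision matrix multiply via NVIDIA CUBLAS.
 * Link with -lcublas -lcudart. */
#include <cublas_v2.h>
#include <cuda_runtime.h>

void gpu_dgemm(int n, const double *A, const double *B, double *C)
{
    size_t bytes = (size_t)n * n * sizeof(double);
    const double alpha = 1.0, beta = 0.0;
    double *dA, *dB, *dC;

    /* Move the operands into device memory. */
    cudaMalloc((void **)&dA, bytes);
    cudaMalloc((void **)&dB, bytes);
    cudaMalloc((void **)&dC, bytes);
    cudaMemcpy(dA, A, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, B, bytes, cudaMemcpyHostToDevice);

    cublasHandle_t handle;
    cublasCreate(&handle);

    /* CUBLAS uses column-major storage, which matches Fortran conventions. */
    cublasDgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                n, n, n, &alpha, dA, n, dB, n, &beta, dC, n);

    /* Copy the result back; this cudaMemcpy waits for the GEMM to finish. */
    cudaMemcpy(C, dC, bytes, cudaMemcpyDeviceToHost);

    cublasDestroy(handle);
    cudaFree(dA); cudaFree(dB); cudaFree(dC);
}

A single-precision variant is identical except for float buffers and cublasSgemm, which reflects the single/double-precision support the summary mentions.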

Leveraging high performance software in the IMSL C Numerical Library

In recent years, traditional high-performance hardware has been supplemented with graphics processing units once used only for 3D visualization. These general-purpose graphics processing units (GPGPUs) have matured to the point that function packages are now available and both single- and double-precision calculations are supported. Together, these developments indicate the environment is mature enough for general-purpose libraries such as IMSL to consider leveraging the hardware.

Leveraging a .NET Numerical Library with Microsoft Excel

Specialized mathematical and statistical libraries can extend the functionality found in Microsoft Excel, taking analysis techniques well beyond any spreadsheet tool. Spreadsheets remain popular, however, because of their ease of use in organizing data and obtaining results quickly. The combination of an advanced numerical library and a spreadsheet with extensibility features lets a developer use a common interface for powerful numerical analysis.
