Award Abstract # 1541349

CC*DNI DIBBs: The Pacific Research Platform

NSF Org: Office of Advanced Cyberinfrastructure (OAC)
Initial Amendment Date: July 30, 2015
Latest Amendment Date: June 4, 2021
Award Number: 1541349
Award Instrument: Cooperative Agreement
Program Manager: Amy Walton
 Office of Advanced Cyberinfrastructure (OAC)
 Direct For Computer & Info Scie & Enginr
Start Date: October 1, 2015
End Date: September 30, 2022 (Estimated)
Total Intended Award Amount: $5,000,000.00
Total Awarded Amount to Date: $8,181,182.00
Funds Obligated to Date:
  • FY 2015 = $5,000,000.00
  • FY 2018 = $1,149,262.00
  • FY 2019 = $16,000.00
  • FY 2020 = $1,015,968.00
  • FY 2021 = $999,952.00
History of Investigator:
  • Larry Smarr (Principal Investigator)
  • Frank Wuerthwein (Co-Principal Investigator)
  • Camille Crittenden (Co-Principal Investigator)
  • Thomas DeFanti (Co-Principal Investigator)
  • Philip Papadopoulos (Co-Principal Investigator)
Awardee Sponsored Research Office: University of California-San Diego
Office of Contract & Grant Admin
La Jolla
CA  US  92093-0934
Sponsor Congressional District: 49
Primary Place of Performance: University of California-San Diego
La Jolla
CA  US  92093-0934
Primary Place of Performance Congressional District:
DUNS ID: 804355790
Parent DUNS ID: 071549000
Program(s): Data Cyberinfrastructure
Primary Program Source: 040100 NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 7433, 8048, 9251
Program Element Code(s): 7231, 7726
Award Agency Code: 4900
Fund Agency Code: 4900
CFDA Number(s): 47.070


Research in data-intensive fields is increasingly multi-investigator and multi-institutional, depending on ever more rapid access to ultra-large heterogeneous and widely distributed datasets. The Pacific Research Platform (PRP) is a multi-institutional extensible deployment that establishes a science-driven high-capacity data-centric ‘freeway system.’ The PRP spans all 10 campuses of the University of California, as well as the major California private research universities, four supercomputer centers, and several universities outside California. Fifteen multi-campus data-intensive application teams act as drivers of the PRP, providing feedback to the technical design staff over the five years of the project. These application areas include particle physics, astronomy/astrophysics, earth sciences, biomedicine, and scalable multimedia, providing models for many other applications.

The PRP builds on prior NSF and Department of Energy (DOE) investments. The basic model adopted by the PRP is the ‘Science DMZ,’ prototyped by DOE’s ESnet. (A Science DMZ is defined as ‘a portion of the network, built at or near the campus local network perimeter, that is designed such that the equipment, configuration, and security policies are optimized for high-performance scientific applications rather than for general-purpose business systems.’) Over the last three years, NSF has funded more than 100 U.S. campuses through Campus Cyberinfrastructure – Network Infrastructure and Engineering (CC-NIE) grants to aggressively upgrade their network capacity for greatly enhanced science data access, creating Science DMZs within each campus. The PRP partnership extends these NSF-funded campus Science DMZs to a regional model of high-speed, data-intensive networking: researchers can move data between their laboratories and their collaborators’ sites, supercomputer centers, or data repositories, and that data can traverse multiple heterogeneous networks without performance degradation over campus, regional, national, and international distances. The PRP’s data-sharing architecture, with end-to-end 10/40/100 Gb/s connections, provides long-distance virtual co-location of data with computing resources, along with enhanced security options.
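The paragraph above turns on sustained end-to-end throughput between Science DMZs. In practice this is validated with tools such as perfSONAR or iperf3 running between Data Transfer Nodes; the sketch below only illustrates the measurement principle (bytes sent over a TCP connection, timed, converted to Gb/s) using a loopback socket. All names and parameters here are illustrative, not part of the PRP software stack.

```python
import socket
import threading
import time


def _sink(server: socket.socket, total: int) -> None:
    """Accept one connection and drain exactly `total` bytes."""
    conn, _ = server.accept()
    with conn:
        remaining = total
        while remaining:
            chunk = conn.recv(min(remaining, 1 << 20))
            if not chunk:
                break
            remaining -= len(chunk)


def measure_throughput_gbps(num_bytes: int = 1 << 28) -> float:
    """Send `num_bytes` over a loopback TCP socket and return Gb/s."""
    server = socket.socket()
    server.bind(("127.0.0.1", 0))  # ephemeral port
    server.listen(1)
    port = server.getsockname()[1]
    receiver = threading.Thread(target=_sink, args=(server, num_bytes))
    receiver.start()

    payload = bytes(1 << 20)  # 1 MiB buffer reused for every send
    start = time.perf_counter()
    with socket.create_connection(("127.0.0.1", port)) as client:
        sent = 0
        while sent < num_bytes:
            chunk = payload[: num_bytes - sent]
            client.sendall(chunk)
            sent += len(chunk)
    receiver.join()  # wait until the sink has read everything
    elapsed = time.perf_counter() - start
    server.close()
    return num_bytes * 8 / elapsed / 1e9


if __name__ == "__main__":
    print(f"loopback throughput: {measure_throughput_gbps():.2f} Gb/s")
```

A real Science DMZ measurement differs mainly in scale: the endpoints sit on separate campuses, the path crosses regional and national backbones, and disk-to-disk (not just memory-to-memory) rates are what the application teams actually experience.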


  • Pandey, P.K., Chhabra, S. and Sharma, A. “An Observable Network Route Support on Interpretation of Cloud Computing.” International Journal of Innovative Technology and Exploring Engineering , v.9 , 2020 , p.2

  • Heintz, A., Razavimaleki, V., Duarte, J., DeZoort, G., Ojalvo, I., Thais, S., Atkinson, M., Neubauer, M., Gray, L., Jindariani, S. and Tran, N. “Accelerated charged particle tracking with graph neural networks on FPGAs” arXiv preprint , 2020 , p.arXiv:201

  • Bharathkumar, K., Paolini, C. and Sarkar, M. “FPGA-based Edge Inferencing for Fall Detection.” 2020 IEEE Global Humanitarian Technology Conference (GHTC) , 2020

  • Fajardo, E., Wuerthwein, F., Bockelman, B., Livny, M., Thain, G., Clark, J.A., Couvares, P. and Willis, J. “Adapting LIGO workflows to run in the Open Science Grid” SoftwareX , v.14 , 2021

  • Zhang, Q., Xiao, T., Efros, A.A., Pinto, L. and Wang, X. “Learning Cross-Domain Correspondence for Control with Dynamics Cycle-Consistency” arXiv preprint , 2020 , p.arXiv:201

  • Nguyen, M.H., Block, J., Crawl, D., Siu, V., Bhatnagar, A., Rodriguez, F., Kwan, A., Baru, N.,and Altintas, I. “Land Cover Classification at the Wildland Urban Interface using High-Resolution Satellite Imagery and Deep Learning” 2018 IEEE International Conference on Big Data (Big Data) , 2018 

  • Boada, A., Paolini, C. and Castillo, J.E. “High-order mimetic finite differences for anisotropic elliptic equations.” Computers & Fluids , v.213 , 2020

  • Sfiligoi, I., Schultz, D., Riedel, B., Wuerthwein, F., Barnet, S. and Brik, V. “Demonstrating a Pre-Exascale, Cost-Effective Multi-Cloud Environment for Scientific Computing: Producing a fp32 ExaFLOP hour worth of IceCube simulation data in a single workday.” Practice and Experience in Advanced Research Computing , 2020 

  • Shourov, E.C. and Paolini, C. “Laying the Groundwork for Automated Computation of Surrogate Safety Measures (SSM) for Skateboarders and Pedestrians using Artificial Intelligence.” 2020 Third International Conference on Artificial Intelligence for Industries (AI4I) , 2020

  • Qin, Y., Rodero, I., Simonet, A., Meertens, C., Reiner, D., Riley, J. and Parashar, M. “Leveraging user access patterns and advanced cyberinfrastructure to accelerate data delivery from shared-use scientific observatories.” Future Generation Computer Systems , v.122 , 2020 , p.14

  • Fajardo, E., Arora, A., Davila, D., Gao, R., Würthwein, F. and Bockelman, B. “Systematic benchmarking of HTTPS third party copy on 100Gbps links using XRootD” arXiv preprint arXiv:2103.12116. , 2020

  • Sfiligoi, I. “Demonstrating 100 Gbps in and out of the public Clouds.” Practice and Experience in Advanced Research Computing. , 2020 

  • Sfiligoi, I., McDonald, D. and Knight, R. “Porting and optimizing UniFrac for GPUs: Reducing microbiome analysis runtimes by orders of magnitude.” Practice and Experience in Advanced Research Computing , 2020 

  • Fajardo, E., Bockelman, B. and Wuerthwein, F. “Testing the limits of HTTPS single point third party copy transfer over the WAN” EPJ Web of Conferences , v.245 , 2020 , p.04025

  • Ralph F.M., Wilson A.M., Shulgina T., Kawzenuk B., Sellars S., Rutz J.J., Lamjiri M.A., Barnes E.A., Gershunov A., Guan B., Nardi K.M., Osborne T., and Wick G.A. “ARTMIP-early start comparison of atmospheric river detection tools: how many atmospheric rivers hit northern California’s Russian River watershed?” Climate Dynamics , v.52 , 2019 , p.4973

  • Fajardo, E., Tadel, M., Balcas, J., Tadel, A., Würthwein, F., Davila, D., Guiang, J. and Sfiligoi, I. “Moving the California distributed CMS XCache from bare metal into containers using Kubernetes” EPJ Web of Conferences , v.245 , 2020

  • Gala K., Bryden P., Paolini C., Dimitrova A., and Sarkar M. “Real-time indoor geolocation tracking for assisted healthcare facilities” Wireless Telecommunications Symposium , 2019

  • Erdem, C., Bensman, E.M., Mutsuddy, A., Saint-Antoine, M.M., Bouhaddou, M., Blake, R.C., Dodd, W., Gross, S.M., Heiser, L.M., Feltus, F.A. and Birtwistle, M.R. “A Simple and Efficient Pipeline for Construction, Merging, Expansion, and Simulation of Large-Scale, Single-Cell Mechanistic Models.” bioRxiv , 2020

  • Sfiligoi, I., Schultz, D., Würthwein, F. and Riedel, B. “Pushing the Cloud Limits in Support of IceCube Science.” IEEE Internet Computing , v.25 , 2021 , p.71

  • Sfiligoi, I., Graham, J. and Wuerthwein, F. “Characterizing network paths in and out of the clouds” EPJ Web of Conferences , v.245 , 2020 , p.07059

  • Ogle, C., Reddick, D., McKnight, C., Biggs, T., Pauly, R., Ficklin, S.P., Feltus, F.A. and Shannigrahi, S. “Named Data Networking for Genomics Data Management and Integrated Workflows.” Frontiers in big Data, 4 , 2021

  • Kansal, R., Duarte, J., Orzari, B., Tomei, T., Pierini, M., Touranakou, M., Vlimant, J.R. and Gunopulos, D. “Graph Generative Adversarial Networks for Sparse Data Generation in High Energy Physics” arXiv preprint , 2020 , p.arXiv:201

  • Yazdani, M., Nguyen, M.H., Block, J., Crawl, D., Zurutuza, N., Kim, D., Hanson, G., and Altintas, I. “Scalable Detection of Rural Schools in Africa using Convolutional Neural Networks and Satellite Imagery” IEEE/ACM International Conference on Utility and Cloud Computing Companion (UCC Companion) , 2018

  • Sellars S.L., et al. “The evolution of bits and bottlenecks in a scientific workflow trying to keep up with technology: Accelerating 4D image segmentation applied to NASA data.” In Progress , 2019

  • Gala, K., Bryden, P.D., Paolini, C., Wang, M., Mihovska, A.D. and Sarkar, M. “Real-time indoor geolocation tracking for assisted healthcare facilities” International Journal of Interdisciplinary Telecommunications and Networking (IJITN) , v.12 , 2020 , p.1

  • Altintas I., Marcus K., Nealey I., Sellars SL, Graham J., Mishin D., Polizzi J., Crawl D., DeFanti T., and Smarr L. “Workflow-driven distributed machine learning in CHASE-CI: A cognitive hardware and software ecosystem community infrastructure.” , 2019 , p.1903.0680

  • Spoor, S., Wytko, C., Soto, B., Chen, M., Almsaeed, A., Condon, B., Herndon, N., Hough, H., Jung, S., Staton, M. and Wegrzyn, J. “Tripal and Galaxy: supporting reproducible scientific workflows for community biological databases.” Database , 2020

  • Fajardo, E., Weitzel, D., Rynge, M., Zvada, M., Hicks, J., Selmeci, M., Lin, B., Paschos, P., Bockelman, B., Hanushevsky, A. and Würthwein, F. “Creating a content delivery network for general science on the internet backbone using XCaches” EPJ Web of Conferences , v.245 , 2020 , p.04041

  • Haberl, M.G., Wong, W., Penticoff, S., Je, J., Madany, M., Borchardt, A., Boassa, D., Peltier, S.T. and Ellisman, M.H “CDeep3M-Preview: Online segmentation using the deep neural network model zoo” bioRxiv , 2020 

  • Shields CA, Rutz J.J., Leung L-Y, et al. “ARTMIP-early start comparison of atmospheric river detection tools: how many atmospheric rivers hit northern California’s Russian River watershed?” Geoscientific Model Development , v.11 , 2018 , p.2455

  • Sfiligoi, I., Würthwein, F., Riedel, B. and Schultz, D. “Running a pre-exascale, geographically distributed, multi-cloud scientific simulation” International Conference on High Performance Computing , 2020 

  • Sfiligoi, I., McDonald, D. and Knight, R. “Accelerating key bioinformatics tasks 100-fold by improving memory access” arXiv preprint , 2021 , p.arXiv:210

  • Madany, M., Marcus, K., Peltier, S., Ellisman, M.H. and Altintas, I. “NeuroKube: An Automated and Autoscaling Neuroimaging Reconstruction Framework using Cloud Native Computing and AI.” 2020 IEEE International Conference on Big Data (Big Data) , 2020 , p.320

  • Gardner, R., Bryant, L., Neubauer, M., Wuerthwein, F., Stephen, J. and Chien, A. “The Scalable Systems Laboratory: a Platform for Software Innovation for HEP” EPJ Web of Conferences , v.245 , 2020 , p.05019

  • Nguyen, M.H., Abdelmaguid, E., Huang, J., Kenchareddy, S., Singla, D., Wilke, L., Bobar, M., Carruth, E.D., Uys, D., Altintas, I, Muse, E.E., Quer, G., and Steinhubl, S. “Analytics Pipeline for Left Ventricle Segmentation and Volume Estimation on Cardiac MRI using Deep Learning” IEEE 14th International Conference on e-Science , 2018 

  • Copps, E., Zhang, H., Sim, A., Wu, K., Monga, I., Guok, C., Würthwein, F., Davila, D. and Fajardo, E. “Analyzing scientific data sharing patterns for in-network data caching” arXiv preprint , 2021 , p.arXiv:210