Merge pull request #62 from BerkeleyLab/copyrights
Add copyright statements
rouson authored Jun 11, 2023
2 parents 1fd4a06 + 77f59ae commit ada289d
Showing 11 changed files with 21 additions and 1 deletion.
2 changes: 1 addition & 1 deletion README.md
@@ -42,7 +42,7 @@ The novel features of Inference-Engine include
 2. Gathering network weights and biases into contiguous arrays
 3. Runtime selection of inference strategy and activation strategy.
 
-Item 1 facilitates invoking Inference-Engine's `infer` function inside Fortran's `do concurrent` constructs, which some compilers can offload automatically to graphics processing units (GPUs). We envision this being useful in applications that require large numbers of independent inferences or networks to to train. Item 2 exploits the special case where the number of neurons is uniform across the network layers. The use of contiguous arrays facilitates spatial locality in memory access patterns. Item 3 offers the possibility of adaptive inference method selection based on runtime information. The current methods include ones based on intrinsic functions, `dot_product` or `matmul`. Future options will explore the use of OpenMP and OpenACC for vectorization, multithreading, and/or accelerator offloading.
+Item 1 facilitates invoking Inference-Engine's `infer` function inside Fortran's `do concurrent` constructs, which some compilers can offload automatically to graphics processing units (GPUs). We envision this being useful in applications that require large numbers of independent inferences or multiple networks to train concurrently. Item 2 exploits the special case where the number of neurons is uniform across the network layers. The use of contiguous arrays facilitates spatial locality in memory access patterns. Item 3 offers the possibility of adaptive inference method selection based on runtime information. The current methods include ones based on intrinsic functions, `dot_product` or `matmul`. Future options will explore the use of OpenMP and OpenACC for vectorization, multithreading, and/or accelerator offloading.
 
 Downloading, Building and Testing
 ---------------------------------
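The README paragraph above describes calling `infer` inside `do concurrent` so that each independent inference can run in parallel, or be offloaded to a GPU by a capable compiler. A minimal sketch of that usage pattern follows; it is not code from this commit, and the exact interface of `infer` (shown here as a type-bound function mapping an input vector to an output vector) is an assumption, as are the array names and sizes.

```fortran
! Hypothetical sketch of item 1: batched, independent inferences
! expressed with Fortran's `do concurrent`.
program concurrent_inference_sketch
  use inference_engine_m, only : inference_engine_t  ! module named in this diff
  use kind_parameters_m, only : rkind                ! module named in this diff
  implicit none
  integer, parameter :: num_inputs = 4, num_outputs = 2, batch_size = 1024
  type(inference_engine_t) network
  real(rkind) inputs(num_inputs, batch_size), outputs(num_outputs, batch_size)
  integer i

  ! ... construct `network` and fill `inputs` here ...

  ! Each iteration is independent of the others, so the loop may execute
  ! in any order or concurrently -- the property that lets some compilers
  ! offload it automatically to a GPU.
  do concurrent (i = 1:batch_size)
    outputs(:,i) = network%infer(inputs(:,i))  ! assumed type-bound interface
  end do
end program concurrent_inference_sketch
```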
2 changes: 2 additions & 0 deletions src/inference_engine/expected_outputs_m.f90
@@ -1,3 +1,5 @@
+! Copyright (c), The Regents of the University of California
+! Terms of use are as specified in LICENSE.txt
 module expected_outputs_m
 use kind_parameters_m, only : rkind
 implicit none
2 changes: 2 additions & 0 deletions src/inference_engine/expected_outputs_s.f90
@@ -1,3 +1,5 @@
+! Copyright (c), The Regents of the University of California
+! Terms of use are as specified in LICENSE.txt
 submodule(expected_outputs_m) expected_outputs_s
 implicit none
 
2 changes: 2 additions & 0 deletions src/inference_engine/input_output_pair_m.f90
@@ -1,3 +1,5 @@
+! Copyright (c), The Regents of the University of California
+! Terms of use are as specified in LICENSE.txt
 module input_output_pair_m
 use expected_outputs_m, only : expected_outputs_t
 use kind_parameters_m, only : rkind
2 changes: 2 additions & 0 deletions src/inference_engine/input_output_pair_s.f90
@@ -1,3 +1,5 @@
+! Copyright (c), The Regents of the University of California
+! Terms of use are as specified in LICENSE.txt
 submodule(input_output_pair_m) input_output_pair_s
 implicit none
 
2 changes: 2 additions & 0 deletions src/inference_engine/kind_parameters_m.f90
@@ -1,3 +1,5 @@
+! Copyright (c), The Regents of the University of California
+! Terms of use are as specified in LICENSE.txt
 module kind_parameters_m
 implicit none
 private
2 changes: 2 additions & 0 deletions src/inference_engine/mini_batch_m.f90
@@ -1,3 +1,5 @@
+! Copyright (c), The Regents of the University of California
+! Terms of use are as specified in LICENSE.txt
 module mini_batch_m
 use input_output_pair_m, only : input_output_pair_t
 use kind_parameters_m, only : rkind
2 changes: 2 additions & 0 deletions src/inference_engine/mini_batch_s.f90
@@ -1,3 +1,5 @@
+! Copyright (c), The Regents of the University of California
+! Terms of use are as specified in LICENSE.txt
 submodule(mini_batch_m) mini_batch_s
 implicit none
 
2 changes: 2 additions & 0 deletions src/inference_engine/network_increment_m.f90
@@ -1,3 +1,5 @@
+! Copyright (c), The Regents of the University of California
+! Terms of use are as specified in LICENSE.txt
 module network_increment_m
 use kind_parameters_m, only : rkind
 implicit none
2 changes: 2 additions & 0 deletions src/inference_engine/network_increment_s.f90
@@ -1,3 +1,5 @@
+! Copyright (c), The Regents of the University of California
+! Terms of use are as specified in LICENSE.txt
 submodule(network_increment_m) network_increment_s
 use assert_m, only : assert
 implicit none
2 changes: 2 additions & 0 deletions src/inference_engine_m.f90
@@ -1,3 +1,5 @@
+! Copyright (c), The Regents of the University of California
+! Terms of use are as specified in LICENSE.txt
 module inference_engine_m
 use activation_strategy_m
 use concurrent_dot_products_m
