
Commit 727fd41

Merge pull request #162 from dotChris90/master
Extend doc and generated new API docs
2 parents 9b8c0f2 + 3c4741e commit 727fd41

33 files changed (+16477, -56 lines)

GenerateDoc.ps1

Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
+
+docfx metadata ./docfx_project/docfx.json
+
+docfx build ./docfx_project/docfx.json -o ./docs

docfx_project/articles/intro.md

Lines changed: 68 additions & 5 deletions
@@ -3,7 +3,7 @@
The following pages are for users who want to use NumSharp.

Before you read the code examples you should read this page, which explains some basic concepts.
-Another reference is numpy, since we try our best to follow its APIs.
+Another reference is numpy, since we try our best to follow its APIs (**high-level, not low-level**).

## NDArray, NDStorage and Shape

@@ -27,20 +27,83 @@ NumSharp brings its own tensor / array type called **NDArray**.
So now the question: .NET already offers multi-dimensional arrays, so why a new array type?

NumSharp's NDArray offers the capability of storing any tensor (independent of its dimension!) in its internal storage.
-So NumSharp's NDArray can store a vector, a matrix or a tensor of dimension 5 and higher. This is not possible with .NET arrays, since each tensor type is a different class.
+So NumSharp's NDArray can store a vector, a matrix or a tensor of dimension 5 and higher. This is not possible with .NET arrays, since each tensor type is a different class. This gives users the possibility to use the same methods for different tensor types.

Now the next question: how can an NDArray do this?

-First of all we need to be a little bit more abstract. Why do we use tensors? Because we want to store data and read it back. How do we get and set it? Via indexes (which are always integers). So only the data and the corresponding indexes matter.
+First of all we need to be a little bit more abstract. Why do we use tensors? Because we want to store data and read it back. How do we get and set it? Via indexes (which are always integers). So only the data and the corresponding indexes matter. That's it: data + indexes. :)

With this in mind we can easily understand the NDStorage of NumSharp.

-NDStorage is an object which stores the data of a tensor in a single 1D array. Since a 1D array is independent of the tensor dimension, NDStorage can be used for all kinds of tensors.
+NDStorage is an object which stores the data of a tensor in a single 1D array. Since a 1D array is independent of the tensor dimension, NDStorage can be used for all kinds of tensors. A vector is stored inside a 1D array, and so is a matrix, a 3-dimensional tensor and so on.

-**But hold on! How does the data get into this 1D array?**
+**But hold on! How does the data get into this 1D array and how do we get it back?**

NDStorage has a property called "shape". The shape is a small but important class in NumSharp. It stores the dimensions and, most importantly, it determines which element in the 1D array is selected for a given set of indexes.

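To make the data-plus-indexes idea concrete, here is a minimal sketch in plain C#. It is not the real NumSharp NDStorage class, just an illustration of the concept: one flat 1D array holds the values and a shape array remembers the dimensions (the name `FlatStorage` is made up for this example).

```csharp
// Minimal sketch of the idea behind NDStorage (illustrative only, not NumSharp code).
// A single 1D array holds every element; the shape remembers the tensor dimensions.
public class FlatStorage
{
    public double[] Data { get; }   // the flat 1D storage
    public int[] Shape { get; }     // e.g. {6} for a vector, {2, 3} for a 2x3 matrix

    public FlatStorage(params int[] shape)
    {
        Shape = shape;
        int size = 1;
        foreach (int dim in shape)
            size *= dim;            // total element count = product of all dimensions
        Data = new double[size];
    }
}

// A vector with 6 elements and a 2x3 matrix end up with the same kind of storage:
// var vector = new FlatStorage(6);     // vector.Data.Length == 6
// var matrix = new FlatStorage(2, 3);  // matrix.Data.Length == 6 as well
```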

**Vector**

Imagine a 1D tensor (a vector). This case is easy, because you access the data with a single index, like `a = np[idx]`. The internal data store in NDStorage is a 1D array, so the index used for access is the same index used in the internal storage.

**Matrix**

Here it is a little bit more tricky. Each data element is addressed by 2 indexes, like `np[idx, jdx] = 5`. The internal storage is a 1D array, so there must be a way to map the 2 indexes [idx, jdx] at the NDArray level to a single index [kdx] at the NDStorage level.

Indeed there is!

Not just in NumSharp but also in many other frameworks, libraries and (generally speaking) languages, it is good style to store the elements of a matrix row-wise or column-wise in a 1D array. For a more formal description you can check https://en.wikipedia.org/wiki/Row-_and_column-major_order. Row-wise and column-wise layout are also often called row major and column major.

Generally speaking, if you imagine a matrix as a table, row-wise means that you start with element [0,0] (as the first element in the 1D array) and take the elements of the 1st row column by column (storing them in the 1D array) until all elements of the 1st row are stored. You go on with the 2nd row and take the elements [1,0], [1,1], [1,2], ..., [1,n-1]. Continue with this pattern until all elements are inside the 1D array.

Column-wise also starts with the element [0,0], but it stays in the 1st column and takes the elements along the rows until all elements of the 1st column are stored. Repeat this with the 2nd column, the 3rd and so on.

The image below (taken from https://en.wikipedia.org/wiki/File:Row_and_column_major_order.svg) shows the 'algorithm' for storing data from a matrix into a vector.

![Row Wise Column Wise](../images/rowWise_ColumnWise.png)

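As an illustration of the two layouts, here is a small, plain-C# sketch (not NumSharp's actual implementation; the class and method names are made up) that maps matrix indexes [i, j] of an m-by-n matrix to a flat index for both orders:

```csharp
using System;

// Sketch: mapping matrix indexes [i, j] to a single flat index k for an m-by-n matrix.
// Illustrative only - not a quote of NumSharp's internal code.
class LayoutDemo
{
    // Row major (row-wise): whole rows are stored one after another.
    static int RowMajorIndex(int i, int j, int n) => i * n + j;

    // Column major (column-wise): whole columns are stored one after another.
    static int ColumnMajorIndex(int i, int j, int m) => j * m + i;

    static void Main()
    {
        int m = 2, n = 3; // a 2x3 matrix

        for (int i = 0; i < m; i++)
            for (int j = 0; j < n; j++)
                Console.WriteLine(
                    $"[{i},{j}] -> row major k = {RowMajorIndex(i, j, n)}, " +
                    $"column major k = {ColumnMajorIndex(i, j, m)}");

        // Row major order of the flat array:    [0,0] [0,1] [0,2] [1,0] [1,1] [1,2]
        // Column major order of the flat array: [0,0] [1,0] [0,1] [1,1] [0,2] [1,2]
    }
}
```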
**N-dimensional tensor**

Now we come to the most tricky question: how to store a general n-dimensional tensor inside a 1D array.

Short answer: exactly like a matrix, just more generalized.

First we look again at the row-wise order.

[0,0] -> [0,1] -> [0,2] -> [0,3] -> ... -> [0,n-1] -> [1,0] -> [1,1] -> [1,2] -> [1,3] -> ...

So here we stay at one index of the first dimension (one row) and fill the other dimension until it is full. Afterwards we switch to the next index of the first dimension (so we change to the next row).

For higher dimensions like 3D, NumSharp follows the same pattern.

[0,0,0] -> [0,0,1] -> [0,0,2] -> [0,0,3] -> ... -> [0,0,n-1] -> [0,1,0] -> [0,1,1] -> ... -> [0,1,n-1] -> [0,2,0] -> ... -> [0,2,n-1] -> ... -> [0,m-1,0] -> ...

Generally speaking, you can imagine it as a **backward filling layout**.

As you can see, the dimensions are filled beginning with the last dimension; once a dimension is full, the index of the dimension before it is increased.

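A sketch of this backward filling (row major) order in plain C# - again only an illustration, not NumSharp's actual code (the `RowMajor` helper is invented for this example). The flat index is computed from strides, where the last dimension has stride 1 and every earlier dimension jumps over the whole block behind it:

```csharp
// Row major ("backward filling") mapping from an n-dimensional index to a flat index.
// Illustrative sketch, not NumSharp's internal implementation.
static class RowMajor
{
    public static int FlatIndex(int[] indexes, int[] shape)
    {
        // The last dimension changes fastest, so its stride is 1;
        // each dimension before it strides over the whole block behind it.
        int flat = 0, stride = 1;
        for (int d = shape.Length - 1; d >= 0; d--)
        {
            flat += indexes[d] * stride;
            stride *= shape[d];
        }
        return flat;
    }
}

// Example for shape {2, 3, 4}:
// RowMajor.FlatIndex(new[] {0, 0, 0}, new[] {2, 3, 4}) == 0
// RowMajor.FlatIndex(new[] {0, 0, 1}, new[] {2, 3, 4}) == 1    (last index moves first)
// RowMajor.FlatIndex(new[] {0, 1, 0}, new[] {2, 3, 4}) == 4
// RowMajor.FlatIndex(new[] {1, 0, 0}, new[] {2, 3, 4}) == 12
```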
Next we look at the column-wise order.

[0,0] -> [1,0] -> [2,0] -> [3,0] -> ... -> [n-1,0] -> [0,1] -> [1,1] -> [2,1] -> [3,1] -> ...

Again we stay at one index, but this time it is the index of the last dimension (the column). The rows are filled until the 1st column is full, and then the next dimension's index is increased.

So: fill the first dimension, increase the next one, fill again, and so on - this also holds for an n-dimensional tensor.

[0,0,0] -> [1,0,0] -> [2,0,0] -> [3,0,0] -> ... -> [n-1,0,0] -> [0,1,0] -> [1,1,0] -> ... -> [n-1,1,0] -> [0,2,0] -> ... -> [n-1,2,0] -> ... -> [0,m-1,0] -> ...

And this you can imagine as a **forward filling layout**.

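And the mirrored sketch for the forward filling (column major) order, again plain illustrative C# with an invented `ColumnMajor` helper: here the first dimension has stride 1 and the strides grow towards the last dimension.

```csharp
// Column major ("forward filling") mapping from an n-dimensional index to a flat index.
// Illustrative sketch, not NumSharp's internal implementation.
static class ColumnMajor
{
    public static int FlatIndex(int[] indexes, int[] shape)
    {
        // The first dimension changes fastest, so its stride is 1;
        // each following dimension strides over the whole block before it.
        int flat = 0, stride = 1;
        for (int d = 0; d < shape.Length; d++)
        {
            flat += indexes[d] * stride;
            stride *= shape[d];
        }
        return flat;
    }
}

// Example for shape {2, 3, 4}:
// ColumnMajor.FlatIndex(new[] {1, 0, 0}, new[] {2, 3, 4}) == 1    (first index moves first)
// ColumnMajor.FlatIndex(new[] {0, 1, 0}, new[] {2, 3, 4}) == 2
// ColumnMajor.FlatIndex(new[] {0, 0, 1}, new[] {2, 3, 4}) == 6
```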

docfx_project/docfx.json

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@
{
  "src": "../",
  "files": [
-    "src/**.csproj"
+    "src/NumSharp.Core/NumSharp.Core.csproj"
  ]
}
],
[binary file added, 48.6 KB - preview not rendered]
