I am implementing ndarray in Pascal

Friday, January 10, 2020

I love Pascal. But I also love numpy's ndarray. Most of my projects coded in Python will most likely use numpy. There is no good solution (at least for me) for realizing ndarray functionality in Pascal. Thus, I am trying to implement one on my own (here). So be it.

Meet noe

I am not trying to be pretentious by competing with numpy. Heck, nobody is trying to compete with numpy. This is merely my experimental project to understand the mechanism behind n-dimensional arrays: how they are laid out in memory, how indexing works, etc.
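
To give an idea of that mechanism, here is a minimal sketch of the usual technique (just an illustration, not noe's actual code): the values live in a flat one-dimensional array, and a "strides" array translates a multidimensional index into an offset.

program StrideSketch;
{$mode objfpc}

{ Row-major n-dimensional storage: values live in a flat array;
  "strides" map a multidimensional index to a flat offset. }
var
  Shape, Strides: array of LongInt;

{ Row-major strides: the last axis is contiguous. }
procedure ComputeStrides;
var
  i: LongInt;
begin
  SetLength(Strides, Length(Shape));
  Strides[High(Strides)] := 1;
  for i := High(Strides) - 1 downto 0 do
    Strides[i] := Strides[i + 1] * Shape[i + 1];
end;

{ Offset of a multidimensional index into the flat data. }
function FlatOffset(const Index: array of LongInt): LongInt;
var
  i: LongInt;
begin
  Result := 0;
  for i := 0 to High(Index) do
    Result := Result + Index[i] * Strides[i];
end;

begin
  SetLength(Shape, 3);
  Shape[0] := 2; Shape[1] := 2; Shape[2] := 3;
  ComputeStrides;                 { strides = [6, 3, 1] }
  WriteLn(FlatOffset([1, 0, 1])); { prints 7 }
end.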


Furthermore, in the near future I also want to learn what happens under the hood of automatic differentiation for a neural network, so this project might be a good basis for it. Noe (in Korean: 뇌, meaning “brain”) is developed towards the implementation of neural networks. Hence the name.

What can it accomplish?

As for now, not much. It is really minimal in its current form. But let me show you what I've learned (and thus, implemented):

Declaring and initializing tensors

uses
  noe.core, // --> main unit
  noe.math; // --> extending the standard math unit

var
  A, B, C: TTensor; 

A tensor filled with a specific value:

{ 2x3 tensor filled with 1 }
A := FullTensor([2, 3], 1);
PrintTensor(A);
[[1.00, 1.00, 1.00]
 [1.00, 1.00, 1.00]]

A tensor of random values:

{ 2x2x3 tensor filled with randomized values }
B := FullTensor([2, 2, 3]);
PrintTensor(B);
[[[0.83, 0.03, 0.96]
  [0.68, 0.91, 0.04]]

 [[0.45, 0.83, 0.70]
  [0.56, 0.31, 0.77]]]

Multidimensional indexing

Accessing the value of a tensor using a multidimensional index:

{ Get the value of tensor A at index 0,1 }
WriteLn(A.GetVal([0, 1])); // will give 1

{ Get the value of tensor B at index 1,0,1 }
WriteLn(B.GetVal([1, 0, 1])); // will give 0.83
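
(This follows from the row-major layout sketched earlier: for shape [2, 2, 3] the strides are [6, 3, 1], so index [1, 0, 1] lands at flat offset 1*6 + 0*3 + 1*1 = 7, and the eighth value in B's printout above is indeed 0.83.)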

A tensor with specified values:

{ 2x3 tensor filled with the specified values }
C := FullTensor(
  [2, 3],      //--> target shape

  [1., 2., 3., //--> the data
   4., 5., 6.] //
);
PrintTensor(C);

WriteLn;

{ Reshape C into a 6x1 tensor }
C.Reshape([6, 1]);
PrintTensor(C);
[[1.00, 2.00, 3.00]
 [4.00, 5.00, 6.00]]

[[1.00]
 [2.00]
 [3.00]
 [4.00]
 [5.00]
 [6.00]]

Some basic element-wise arithmetic operations are also supported:

A := FullTensor([3, 3], 1);
B := FullTensor([3, 3]);
WriteLn('A:');
PrintTensor(A);
WriteLn('B:');
PrintTensor(B);

WriteLn('A + B:');
PrintTensor(A + B);

WriteLn('A - B:');
PrintTensor(A - B);

WriteLn('A * B:');
PrintTensor(A * B);
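
These infix operators work because Free Pascal supports overloading operators for user-defined types. As a rough sketch of the mechanism (using a hypothetical flat TVec type rather than the real TTensor), an element-wise + boils down to a single loop over the flat data:

program ElementwiseSketch;
{$mode objfpc}

type
  TVec = array of Double;

{ Element-wise addition; both operands must have the same length
  (for real tensors, the same shape). }
operator + (const A, B: TVec) R: TVec;
var
  i: LongInt;
begin
  Assert(Length(A) = Length(B));
  SetLength(R, Length(A));
  for i := 0 to High(A) do
    R[i] := A[i] + B[i];
end;

var
  X, Y: TVec;
  i: LongInt;
begin
  SetLength(X, 3);
  SetLength(Y, 3);
  for i := 0 to High(X) do
  begin
    X[i] := 1.0;       { like FullTensor([3], 1) }
    Y[i] := i + 1;
  end;
  X := X + Y;          { calls the overloaded operator }
  for i := 0 to High(X) do
    WriteLn(X[i]:0:2); { 2.00, 3.00, 4.00 }
end.

Subtraction, multiplication, and the power operator can be overloaded the same way.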

And some others:

A := FullTensor([3, 3]) + FullTensor([3, 3], 1);
PrintTensor( Log10(A) );
PrintTensor( Log2(A) );

A := FullTensor(
  [2, 2],

  [ 0., 30.,
   45., 90.]
);

A := DegToRad(A); // Also check RadToDeg(A)
PrintTensor( Sin(A) );
PrintTensor( Cos(A) );
PrintTensor( Tan(A) );

A := FullTensor(
  [2, 2],
  [1., 2.,
   3., 4.]
);
A := A ** 2;
PrintTensor(A); 
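
Under the hood, unary functions like Log2, Sin, or DegToRad presumably just apply the corresponding scalar function from the standard Math unit to every element. A sketch of that pattern (again on a hypothetical TVec, not the real TTensor):

program MapSketch;
{$mode objfpc}

uses
  Math;

type
  TVec = array of Double;
  TUnaryFunc = function(x: Double): Double;

{ Lift a scalar function to an element-wise function over the flat data. }
function ApplyElementwise(f: TUnaryFunc; const A: TVec): TVec;
var
  i: LongInt;
begin
  SetLength(Result, Length(A));
  for i := 0 to High(A) do
    Result[i] := f(A[i]);
end;

{ Wrapper so the signature matches TUnaryFunc exactly. }
function Log2Scalar(x: Double): Double;
begin
  Result := Math.Log2(x);
end;

var
  V: TVec;
  i: LongInt;
begin
  SetLength(V, 3);
  V[0] := 1; V[1] := 2; V[2] := 8;
  V := ApplyElementwise(@Log2Scalar, V);
  for i := 0 to High(V) do
    WriteLn(V[i]:0:2); { 0.00, 1.00, 3.00 }
end.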

Please check noe.math.pas for more of the covered functionality.

So what's next?

As I said, I am not trying to be ambitious. I will just enjoy the learning process. Thus, as for now, I don't really have a long to-do list. I implement what I think needs to be implemented. Let me know if you want to lend me a hand.

On the previous attempt…

I attempted a solution for a general-purpose machine learning framework before. It is called darkteal. It lives in the same repository as noe, just in a different branch. You can access it here. I learned a lot from developing it: managing data representation (in a 2D array), loading data, performing matrix arithmetic, interfacing with an existing computational library, etc. I even managed to create some machine learning models out of it…

And I created a plotting routine by interfacing with gnuplot to display a nice figure...
However, I found that the approach is not scalable, as the data representation was just a 2D matrix. I want it to be more flexible, with arbitrary dimensions. Switching to ndarray means changing the basis. Thus, I started `noe`. `Darkteal` is not abandoned. It is upgraded 🙈. I even took parts of it for `noe`.

Does that mean I am starting from scratch? Yes. But I am learning and I enjoy it. Nothing beats the joy of solving challenging puzzles… ☕

#development #machine learning
