
Embracing Julia: An Invitation Letter

Warmly Extended to Python Lovers, Scientific Computing Wizards and Data Scientists

Julia is a general-purpose, dynamic, high-performance and high-level programming language that is just-in-time compiled. It's a fairly recent language, with its major 1.0 release rolled out only in 2018. In this story, we aim to demonstrate that this language is absolutely worth adding to your arsenal if you are into data science or scientific computing, or are just an avid Python user. It may well be the most beautiful programming language you will ever come across.

Galaxy with Purple, Green and Red Planets Digital Art — Generated by author using DALLE 2

In this story, we will go over the highlights of Julia and why it's worth learning. Once you are done, we highly recommend you check out the next story, From Python to Julia: An Ultimate Guide, for an easy transition from Python to Julia.

Table of Contents

· Julia is High-level
Basic Syntax
Elegant Syntax for Mathematics
· Julia is Fast
The Two Language Problem
Julia is Just-in-time Compiled
· Julia Solves the Expression Problem
The Expression Problem
Multiple Dispatch
Abstract and Concrete Types
· Julia is Fully Featured
Array Support
String Support
Easy Integration with C Code
The Standard Library
· Julia is General Purpose
Automation and Scripting
· Julia is Extensively Extendible
· Wrapping Up

Photo by Daniele Levis Pelusi on Unsplash

Julia is High-level

The introduction may already have made you feel that this will be like Python: a general-purpose, dynamic and high-level language as well. To verify, let's get a taste of how basic Julia code looks compared to Python.

Basic Syntax

Consider the following guessing game in Python:

import random

def guessing_game(max):
    random_number = random.randint(1, max)
    print(f"Guess a number between 1 and {max}")
    while True:
        user_input = input()
        guess = int(user_input)
        if guess < random_number:
            print("Too low")
        elif guess > random_number:
            print("Too high")
        else:
            print("That's right!")
            break


The following is the equivalent in Julia:

function guessing_game(max::Integer)
    random_number = rand(1:max)
    println("Guess a number between 1 and $max")
    while true
        user_input::String = readline()
        guess = parse(Int, user_input)
        if guess < random_number
            println("Too low")
        elseif guess > random_number
            println("Too high")
        else
            println("That's right!")
            break
        end
    end
end


The main differences here are that Julia does not rely on indentation and does not require colons; instead, it requires an explicit "end" to close scopes for constructs such as if-conditions, loops and functions. You should feel right at home with this if you come from Matlab or Fortran.

Another difference you may have noticed is that Julia natively supports type annotations in variable declarations and function arguments (and return types, although these are rarely used). They are always optional, but are generally used for type assertions, for letting the compiler choose the right method to call when the same function is overloaded for multiple types, and, in some cases of variable and struct declarations, for performance benefits.
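To make this concrete, here is a minimal sketch of both uses; the function names `describe` and `counter_demo` are hypothetical, not from any package:

```julia
# Type annotations are optional: both definitions below are valid Julia.
describe(x) = "something of type $(typeof(x))"     # no annotation
describe(x::Integer) = "an integer with value $x"  # annotated method

println(describe(3.5))  # falls back to the generic method
println(describe(3))    # the Integer method is more specific, so it wins

# Annotations on local variables act as type assertions:
function counter_demo()
    count::Int = 0      # assigning a non-convertible value here would error
    count += 1
    return count
end

println(counter_demo())
```

Calling `describe(3)` dispatches to the `Integer` method even though the unannotated method would also apply, because Julia always picks the most specific matching method.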

Elegant Syntax for Mathematics

# Elegant Expressions
x, y = 2, 3
z = 2y + 3x - 5

# Official Unicode Support
α, β, γ = 1, 2, π/2

# one-line functions
f(r) = π*r^2

f'(3) # derivative (with the Flux.jl package)

# Column vector is literally a column
v₁ = [1
      2
      3
      4]

v₂ = [1 2 3 4]

# transpose
println(v₁' == v₂)

# This is literally a 3×3 matrix
M⁽ⁱ⁾ = [1 2 3
        4 5 7
        7 8 9]

# Explicit modeling of missingness
X = [1, 2, missing, 3, missing]

One serious edge that Julia has over Python is its syntax support for mathematics. `*` need not be written when multiplying a numeric literal by a variable, LaTeX-style symbols are supported for variable names (you may need a VSCode extension or the REPL's tab completion to convert `\pi` to π, `v_1` to v₁, etc.), and matrix literals respect the layout in the code definition.

For instance, suppose you were to implement gradient descent for a neural network.


In Python, you would probably write:

import numpy as np

# Gradient Descent in a Neural Network
J_del_B_n = [np.zeros(b.shape) for b in B_n]
J_del_W_n = [np.zeros(W.shape) for W in W_n]

for (x, y) in zip(x_batch, y_batch):
    J_del_B_n_s, J_del_W_n_s = backprop(x, y)
    J_del_B_n = [J_del_b + J_del_b_s for J_del_b,
                 J_del_b_s in zip(J_del_B_n, J_del_B_n_s)]
    J_del_W_n = [J_del_W + J_del_W_s for J_del_W,
                 J_del_W_s in zip(J_del_W_n, J_del_W_n_s)]

d = len(x_batch)
W_n = [(1 - lambda_val * alpha / d) * W - lambda_val /
       d * J_del_W for W, J_del_W in zip(W_n, J_del_W_n)]
B_n = [(1 - lambda_val * alpha / d) * b - lambda_val /
       d * J_del_b for b, J_del_b in zip(B_n, J_del_B_n)]

Compare the readability of that to what you can write using Julia:

# Gradient Descent in a NN
# Gradient Descent in a NN
მJⳆმBₙ = [zero(b) for b in Bₙ]
მJⳆმWₙ = [zero(W) for W in Wₙ]

for (x, y) in zip(x_batch, y_batch)
    მJⳆმBₙₛ, მJⳆმWₙₛ = backprop(x, y)
    მJⳆმBₙ = [მJⳆმb + მJⳆმbₛ for (მJⳆმb, მJⳆმbₛ) in zip(მJⳆმBₙ, მJⳆმBₙₛ)]
    მJⳆმWₙ = [მJⳆმW + მJⳆმWₛ for (მJⳆმW, მJⳆმWₛ) in zip(მJⳆმWₙ, მJⳆმWₙₛ)]
end

d = length(x_batch)
Wₙ = [(1 - λ*α/d) * W - λ/d * მJⳆმW for (W, მJⳆმW) in zip(Wₙ, მJⳆმWₙ)]
Bₙ = [(1 - λ*α/d) * b - λ/d * მJⳆმb for (b, მJⳆმb) in zip(Bₙ, მJⳆმBₙ)]

You can try to write code like this in Python, but editors will often struggle with the Unicode variables (or fail to highlight them), and your code may not play well with some third-party tools and packages.

Photo by Solaiman Hossen on Unsplash

Julia is Fast

Another major reason why Julia can be thought of as a Python dream come true is that, unlike Python, Ruby and other high-level languages, it does not compromise speed for being high-level. In fact, it can be as fast as low-level languages such as C and C++.


For reference, the following reports the performance of Julia, along with other languages on popular performance benchmarks:

Julia Microbenchmarks: Image via JuliaLang under MIT license

The Two Language Problem

A corollary of Julia's performance is that it solves the two-language problem:

- Research code (e.g., a machine learning model) is typically written in a high-level language such as Python, because being high-level and interactive allows focusing more on the science (fewer code issues) and allows more exploration.
- Once the research code is finalized, it must be rewritten in a low-level language such as C before it's rolled out to production.

The issue here is that the same code has to be written in more than one language. This is generally hard and error-prone; if the research code is modified after it's rolled out, then in the worst case all of it has to be rewritten in the low-level language again.

One way to get around this issue is to write performance-critical libraries (e.g., NumPy) in a low-level language such as C and then wrap them in Python functions that internally call the C ones. These wrappers can be used for both research and production without worrying about performance. In reality, this approach is quite limited because:

- It makes it really hard for new developers to contribute or collaborate with novel scientific methods they have written, since they may need to rewrite those in a low-level language such as C for performance before exposing them in the high-level library.
- Absurd constraints may be imposed on developers using the high-level language in the scientific computing domain. For instance, writing explicit for loops may be heavily discouraged.

Julia solves the two-language problem by being high-level, interactive AND quite fast, even for production.

Julia is Just-in-time Compiled

There is a small caveat related to Julia's performance. Because Julia is JIT-compiled, the first run of any piece of Julia code takes more time to complete. During this time, each function is compiled to native code (i.e., code the processor can execute directly) for the specific argument types inferred from the code. Julia then caches the compiled representation, so if the function is called again with different inputs of the same types, it runs immediately with no recompilation.

To elaborate further: for a function with N arguments, there is a potentially exponential number of native-code specializations, one for every possible combination of types of the N arguments. Julia compiles the function down to the specialization corresponding to the types observed the first time the code runs. Once it does, further calls to the function are effortless. Note that type inference does not necessarily use the type annotations (which are optional and serve the other purposes we mentioned); types can be inferred from the runtime values of the inputs.
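A rough way to observe this specialization, using the hypothetical function `square` below, is to time the same call twice and then call it with a new argument type (exact timings will vary by machine):

```julia
square(x) = x * x

t1 = @elapsed square(2.0)  # includes compiling the Float64 specialization
t2 = @elapsed square(3.0)  # reuses the cached native code, typically far faster
t3 = @elapsed square(2)    # a fresh Int specialization is compiled here

println("first Float64 call: $t1 s, second: $t2 s, first Int call: $t3 s")
```

The second call with a `Float64` pays no compilation cost, while the first call with an `Int` triggers compilation of a new specialization.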

This is rarely an issue, because research code or code running on a server has to compile only once; any further runs of the code (real API calls or further experimentation) are blazing fast.

Photo by Thom Milkovic on Unsplash

Julia Solves the Expression Problem

The Expression Problem

The expression problem is about being able to define a data abstraction that is extensible both in its representations (i.e., supported types) and its behaviors (i.e., supported methods). That is, a solution to the expression problem allows:

- Adding new types to which existing operations apply
- Adding new operations that apply to existing types

without violating the open-closed principle (or causing other issues). This implies that it should be possible to add the new types without modifying the code for existing operations and it should be possible to add new operations without modifying the code for existing types.

Python, like many other programming languages, is object-oriented and fails to address the expression problem.

Suppose we have the following data abstraction:

# Base class
class Shape:
    def __init__(self, color):
        self.color = color

    def area(self):
        raise NotImplementedError

# Child class
class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius

    def area(self):
        return 3.14 * self.radius * self.radius

It’s very easy to add new types to which existing methods should apply. Just inherit from the Shape base class. It does not require the modification of any existing code:

class Rectangle(Shape):
    def __init__(self, width, height):
        self.width = width
        self.height = height

    def area(self):
        return self.width * self.height

Meanwhile, it's not easy to add operations that apply to existing types. If we want to add a perimeter method, we have to modify the base class and every single child class implemented so far.

One consequence of this problem is the following: suppose package x, maintained by author X, initially supports a set of operations Sx, and another set of operations Sy would be helpful to a group of developers Y. To add these methods, Y must modify X's package. In practice, developers Y instead make another package of their own, possibly duplicating the type-implementation code from package x, because author X may not be happy with more code to maintain, and Sy may be a different genre of methods that doesn't have to live in the same package.

On the other hand, because it's easy to add new types to which existing operations apply, if developers Y instead wanted to define a new type that implements the operations of the type implemented by X, they could easily do so without modifying package x or duplicating any of its code: they just import the type and inherit from it.

Multiple Dispatch

To solve the expression problem, which enables massive integration among different packages, Julia does away with traditional object-oriented programming completely. Instead of classes, Julia uses abstract types, structs (custom concrete types that can subtype abstract types), methods, and a technique called multiple dispatch that, as we will see, neatly solves the expression problem.

To see an equivalent of what we had above:

### Shape Abstract Type (Interface)

abstract type Shape end

function area(self::Shape) end

### Circle Type (Implements the Interface)

struct Circle <: Shape
    radius::Float64
end

function area(circle::Circle)
    return 3.14 * circle.radius^2
end
Here we defined an abstract type "Shape". The fact that it's abstract implies that it cannot be instantiated; however, other types can subtype (inherit from) it. Afterwards, we defined a Circle type as a subtype of the Shape abstract type, and we defined the area method while specifying that the input must be of type Circle. With this, we can do:

c = Circle(3.0)
println(area(c))

This would print 28.26. Although c satisfies both area definitions, because it's also a Shape, the second is more specific, so it's the one the compiler chooses for this call.

Similar to class-based OOP, it’s easy to add another type “rectangle” without touching existing code:

struct Rectangle <: Shape
    length::Float64
    width::Float64
end

function area(rect::Rectangle)
    return rect.length * rect.width
end
And now when we do:

rect = Rectangle(3.0, 6.0)
println(area(rect))

We get 18.0. This is multiple dispatch in action: the correct instance of the method area is dynamically dispatched based on the run-time types of the arguments. If you come from a C++ background, this must remind you of function overloading. The difference is that overloading is not dynamic; it relies on the types known at compile time, so you can devise examples where the behavior differs.
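The dynamic part is easiest to see when the static type is abstract but the run-time type is concrete. Here is a self-contained sketch with hypothetical `Pet` types (not from the shapes example above):

```julia
abstract type Pet end
struct Dog <: Pet end
struct Cat <: Pet end

sound(::Dog) = "woof"
sound(::Cat) = "meow"

# The declared element type of this vector is the abstract Pet...
pets = Pet[Dog(), Cat()]

# ...yet the right method is picked from each element's run-time type:
println([sound(p) for p in pets])  # ["woof", "meow"]
```

With C++-style static overloading resolved on the declared type `Pet`, neither call could be resolved; Julia dispatches on what each value actually is at run time.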

More importantly, and unlike class-based OOP, we can add methods to any of Shape, Circle or Rectangle without needing to modify their files. If all the files above are in my package and you wish to add a set of methods that produce animations and 3D visuals of the geometric shapes (which I don’t care about), then all you need is to import my package. Now you can access the Shape, Circle and Rectangle types and you can write the new functions, then export them in your own “ShapeVisuals” package.

### Interface definitions
function animate(self::Shape) end
function ThreeDify(self::Shape) end

### Circle definitions
function animate(self::Circle)
    # ...
end

function ThreeDify(self::Circle)
    # ...
end

### Rectangle definitions
function animate(self::Rectangle)
    # ...
end

function ThreeDify(self::Rectangle)
    # ...
end

When you think about it, the major distinction between this and the OOP you know is that it follows the pattern func(obj, args) instead of obj.func(args). As a bonus, it also makes things like func(obj1, obj2, args) a breeze. The other distinction is that it does not encapsulate methods and data together or impose any protection on them; perhaps a superfluous measure anyway when developers are mature and code is reviewed.

Abstract and Concrete Types

Knowing that an abstract type is simply a type that cannot be instantiated, but that other types can subtype, paves the way to discussing Julia's type system. Recall that it is optional to use the syntax `var::type` to annotate the types of variables upon declaration, in function arguments, or for return values.

Any type in Julia is either abstract, as we defined above, or concrete. Concrete types are those you can instantiate like the custom types we defined above.

Julia has the following hierarchical type system for numbers:

Julia's Number Type Hierarchy: Image via Julia for Optimization and Learning under MIT license

If your function takes one argument and operates on any Number, you use func(x::Number). This only throws an error if a non-numeric value, such as a string, is passed. Meanwhile, if it only works for floats, you write func(x::AbstractFloat); no error is thrown if the input is of type BigFloat, Float64, Float32 or Float16. Because of multiple dispatch, you can also define another instance of the function, func(x::Integer), to handle the case when the given number is an integer.
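Concretely, using a hypothetical `describe_num` with one method per level of the hierarchy:

```julia
describe_num(x::Number)        = "some number: $x"
describe_num(x::AbstractFloat) = "a float: $x"
describe_num(x::Integer)       = "an integer: $x"

println(describe_num(1 + 2im))  # Complex is a Number, but not a float or integer
println(describe_num(3.14))     # Float64 <: AbstractFloat
println(describe_num(big(7)))   # BigInt <: Integer

# describe_num("hi") would throw a MethodError: String is not a Number.
```

Each call lands on the most specific method whose annotation the argument's type subtypes.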

Julia similarly has a hierarchical type system for other abstract types such as AbstractString but they are much simpler.

Photo by Paul Melki on Unsplash

Julia is Fully Featured

If you think about it, Python comes with only bare-bones functionality out of the box. For instance, you can do very little in data science and scientific computing using Python alone, without popular packages such as NumPy. The vast majority of other packages in the field also heavily depend on NumPy; they all use and assume the NumPy array type (instead of the default Python list type) just as if it were part of the language.

Julia isn’t like that. It comes with many important features out-of-the-box, including:

Array Support

Julia comes with array support similar to Numpy out-of-the-box which includes broadcasting and vectorization support. For instance, the following compares popular Numpy operations with how you would write them natively in Julia:

#> 1. Creating an array
### Python
import numpy as np
arr = np.array([[1, 2, 3],
                [4, 5, 6],
                [7, 8, 9]])
### Julia
arr = [1 2 3
       4 5 6
       7 8 9]

#> 2. Getting the shape of an array
### Python
shape = arr.shape
### Julia
shape = size(arr)

#> 3. Reshaping an array
### Python
reshaped_arr = arr.reshape(3, 3)
### Julia
reshaped_arr = reshape(arr, (3, 3))

#> 4. Accessing elements by index
### Python (0-based indexing)
element = arr[1, 2]
### Julia (1-based indexing)
element = arr[1, 2]

#> 5. Performing element-wise arithmetic operations
### Python
multiplication = arr * 3
### Julia
multiplication = arr .* 3

# 6. Array concatenation
### Python
arr1 = np.array([[1, 2, 3]])
arr2 = np.array([[4, 5, 6]])
concatenated_arr = np.concatenate((arr1, arr2), axis=0)
### Julia
arr1 = [1 2 3]
arr2 = [4 5 6]
concatenated_arr = vcat(arr1, arr2)

#> 7. Boolean masking
### Python
mask = arr > 5
masked_arr = arr[mask]
### Julia
mask = arr .> 5
masked_arr = arr[mask]

#> 8. Calculating the sum of array elements
### Python
sum_value = arr.sum()
### Julia
sum_value = sum(arr)
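Beyond mirroring NumPy, Julia's broadcasting works with any function via the dot syntax, and chained dotted operations fuse into a single loop without allocating temporary arrays. A small sketch (the `clip` helper is hypothetical):

```julia
arr = [1 2 3; 4 5 6; 7 8 9]

# Chained dotted operations fuse into one pass over the array:
result = sqrt.(arr .^ 2 .+ 1)

# User-defined functions broadcast exactly like built-ins:
clip(x) = min(x, 5)
println(clip.(arr))  # every element capped at 5
```

In NumPy, applying an arbitrary Python function element-wise (e.g., via `np.vectorize`) falls back to a slow Python loop; in Julia, `clip.(arr)` compiles to a tight native loop.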

String Support

Julia also comes with extensive support for strings and regular expressions out-of-the-box:

name = "Alice"
age = 13

## concatenation
greeting = "Hello, " * name * "!"

## interpolation
message2 = "Next year, you will be $(age + 1) years old."

## regex
text = "Here are some email addresses: al******@gm***.com"

# Define a regex for emails
email_pattern = r"[\w.-]+@[\w.-]+\.\w+"

# match the first email
email_address = match(email_pattern, text)

"aby" > "abc" # true

When strings are compared, those later in lexicographic order (roughly, alphabetical order) are considered greater than those that appear earlier. Most of what you can do with strings in advanced string-processing languages such as Perl can also be done in Julia.
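Note that `match` returns only the first match; to collect every match, `eachmatch` returns an iterator. A short sketch with made-up addresses:

```julia
text = "Contacts: ana@example.com, bob@example.org"
pattern = r"[\w.-]+@[\w.-]+\.\w+"

first_hit = match(pattern, text)  # only the first match
println(first_hit.match)          # ana@example.com

all_hits = [m.match for m in eachmatch(pattern, text)]
println(all_hits)                 # both addresses
```

Each match object also exposes capture groups (`m.captures`) and the match offset (`m.offset`) for more involved processing.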


Multi-threading Support

Python does not support true parallel multi-threading because it comes with a Global Interpreter Lock (GIL). The GIL disallows the interpreter from running multiple threads at the same time, as an overly easy way to guarantee thread safety. It's only possible to switch between threads (e.g., if a server thread is busy waiting for a network request, the interpreter can switch to another thread).

Luckily, it's not hard to release this lock in C programs called by Python, which explains why NumPy is possible. However, if you have a computation-heavy for loop, you can't write pure Python code that executes it in parallel to speed it up. The sad reality for Python is that the vast majority of mathematical operations applying to large data structures, such as matrices, are parallelizable.

Meanwhile, in Julia true parallel multi-threading is natively supported out-of-the-box and it’s as easy as doing this:

# Before multi-threading
for i in eachindex(x)
    y[i] = a * x[i] + y[i]
end

# After multi-threading
Threads.@threads for i in eachindex(x)
    y[i] = a * x[i] + y[i]
end

When you run the code, you get to specify how many threads you want to use among the available ones in your system.
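Putting it together, here is a runnable sketch; the thread count is whatever you pass on the command line (e.g., `julia --threads 4 script.jl`), and the loop is safe to parallelize because each iteration writes to a distinct index:

```julia
println("running with $(Threads.nthreads()) thread(s)")

x = collect(1.0:1000.0)
y = zeros(1000)

# Each iteration touches a distinct y[i], so no locking is needed:
Threads.@threads for i in eachindex(x)
    y[i] = 2.0 * x[i]
end

println(sum(y))  # doubles the sum 1 + 2 + ... + 1000
```

If shared state were mutated across iterations, you would need atomics or locks instead; `Threads.@threads` only splits the iteration space, it does not make the loop body thread-safe for you.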

Easy Integration with C Code

The process of calling C code from Julia is officially supported out-of-the-box and can be done more efficiently and easily than in Python. Suppose you want to call the following C function:

#include <stdio.h>

int add(int a, int b) {
    return a + b;
}

then the main step (after a small setup) to call this function in Julia is writing

# specify the function and library, return type, argument types and inputs.
# C types are prefixed with "C"; "libadd" stands for whatever shared
# library you compiled the C code into.
result = ccall((:add, "libadd"), Cint, (Cint, Cint), 5, 3)

It's far trickier to do this in Python, and it can be less efficient, especially because it's much easier to map Julia types and structures to those in C.
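As a quick demonstration that needs no compilation step at all, you can call straight into the C standard library that Julia is already linked against (assuming a Unix-like system):

```julia
# Call C's strlen directly; Cstring handles the String -> char* conversion.
n = ccall(:strlen, Csize_t, (Cstring,), "hello, Julia")
println(Int(n))  # 12
```

Because the symbol lives in a library already loaded into the process, no library name is needed here; for your own compiled code you would pass the `(:func, "libname")` tuple form instead.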

A major consequence of this is that Julia can interoperate with the vast majority of languages that can expose a C-compatible interface. Typically, well-known external packages exist for those. For instance, to call Python code you can use the PyCall.jl package as follows:

using PyCall

np = pyimport("numpy")

# Create a NumPy array in Python
py_array = np.array([1, 2, 3, 4, 5])

# Perform some operations on the NumPy array
py_mean = np.mean(py_array)
py_sum = np.sum(py_array)
py_max = np.max(py_array)

Almost no prior setup is needed for this besides installing the package. It’s likewise possible to call functions written in Fortran, C++, R, Java, Mathematica, Matlab, Node.js, and more using similar packages.

On the other hand, it's possible to call Julia from Python, although not in as elegant a fashion. This has probably been used before to speed up Python functions without resorting to implementing them in C.

The Standard Library

A set of packages comes pre-installed with Julia (but has to be explicitly loaded). This includes the Statistics and LinearAlgebra packages, the Downloads package for accessing the internet and, more importantly, the Distributed package for distributed computing (like Hadoop), the Profile package for profiling (to help optimize code), the Test package for unit testing and the Pkg package for package management, along with many others.

I must say that I am an avid Python user who has developed multiple packages in Python. There is no comparison between the third-party Setuptools in Python and Pkg in Julia, which is much cleaner and easier to use. I was never able to comprehend why Python does not have its own package-management and testing tools; these are really basic needs for a programming language.
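As a small taste of the built-in tooling, here is a sketch using the bundled Test standard library (the `double` function is hypothetical):

```julia
using Test

double(x) = 2x

@testset "double" begin
    @test double(2) == 4
    @test double(0.5) == 1.0
    # multiplying an Int by a String has no method, so this throws:
    @test_throws MethodError double("a")
end
```

No third-party test framework is needed; `@testset` groups tests and prints a pass/fail summary out of the box.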

Photo by Tom M on Unsplash

Julia is General Purpose


If you have encountered Julia in the past, you may have heard that it's a domain-specific language where scientific computing is the domain. It's true that Julia has been carefully designed to be expressive and efficient for scientific computing, but that does not stop it from being a general-purpose language; it's just one built with scientific computing in mind.

There are, however, degrees to which a language can be general-purpose. For instance, Julia can be used for data science and machine learning, web development, automation and scripting, and robotics aside from scientific computing, but there are still no mature packages that help developers use Julia for things like game development, similar to Pygame in Python. Even if the Julia package Genie.jl is very close to being on par with Flask, it may fall short of more comprehensive frameworks like Django. In short, even if Julia is not as general-purpose as you want it to be at the moment, it's built with that in mind and is expected to get there eventually.

Automation and Scripting

Having mentioned that Julia can be used for automation and scripting, it's worth pointing out that it helps you do so with elegant shell-like syntax.

For instance, here is a set of file system and process operations you can perform in Julia:

# Create a directory
mkdir("my_directory")

# Change the working directory
cd("my_directory")

# List files in the current directory
readdir()

# Remove the directory
cd("..")
rm("my_directory"; recursive=true)

# Check if a file exists
if isfile("my_file.txt")
    println("File exists.")
else
    println("File does not exist.")
end

# Run a simple shell command from Julia
run(`echo "Hello, Julia!"`)

# Capture the output of a shell command
result = read(`ls`, String)
println("Contents of the current directory: $result")

Notice the similarity to what you actually write in the terminal.
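A self-contained variant of the file operations above, using a temporary scratch directory so it can run anywhere without littering the file system (the file name `notes.txt` is arbitrary):

```julia
dir = mktempdir()  # scratch directory under the system temp location
path = joinpath(dir, "notes.txt")
write(path, "Hello from Julia")

println(isfile(path))        # true
content = read(path, String)
println(readdir(dir))        # ["notes.txt"]

rm(dir; recursive=true)      # clean up
```

`mktempdir` also accepts a do-block form that deletes the directory automatically when the block exits.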

An Alternative to Starry Night Digital Art — Generated by author using DALLE 2

Julia is Extensively Extendible


One beautiful feature of the LISP programming language is that it is homoiconic, meaning that code can be treated just like data; hence, new features and semantics can be added to the language by ordinary developers. Julia was also built to be homoiconic. For instance, remember that I said Julia supports multiple dispatch only? Well, someone has made an ObjectOriented.jl package that allows developers to write OOP-style code in Julia. As another example, if you make any new type, it's easy to overload base functions and operators (which are just functions) to work with your new type.


Julia’s support for macros is a major reason why this is possible. You can think of a macro as a function that returns the code to be executed during the parse time of the program. Suppose you define the following macro:

macro add_seven(x)
    # build and return the expression `$x + 7`, escaped so that
    # `x` resolves in the caller's scope rather than the macro's
    return esc(:($x + 7))
end

Similar to a function, this allows you to call it in this way:

x = 5
@add_seven x

which returns 12. What happens under the hood is that during parse time (before compilation) the macro executes, returning the expression x + 7, which then compiles and evaluates to 12. You can think of macros as a way to dynamically perform CTRL+H (find-and-replace) operations on your code.

For another practical use case, suppose you have a package with 10 useful methods and you want to add a new interface to the package, which requires writing 10 structs, one for each method. If writing each struct given the corresponding function is systematic, then you can write a single macro that loops over the 10 functions to generate the code for the 10 structs. In effect, the code you write is equivalent to writing a single struct in a generic manner, which saves time.
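The same loop-over-definitions idea can be sketched on a much smaller scale with `@eval`, which splices values into code and evaluates it; the names `double_it` and `triple_it` below are hypothetical:

```julia
# Generate two near-identical function definitions in a loop,
# instead of writing each one out by hand:
for (name, factor) in [(:double_it, 2), (:triple_it, 3)]
    @eval $name(x) = $factor * x
end

println(double_it(10))  # 20
println(triple_it(10))  # 30
```

Scaled up, this is exactly how a package can mechanically generate one struct (or method) per existing function.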

The fact that macros are possible allows for much more magic. For instance, recall that above we were able to multi-thread a for loop using the Threads.@threads macro. To measure the execution time of a function call, all you do is @time func(); if you are using the BenchmarkTools package, then @benchmark func() calls the function many times to return statistics about the time and even a small plot. If you know what memoization is, even that can be applied to any function with a simple @memoize macro, with no need to modify the function in any way. There is even @code_native func(), which shows you the native code generated for the call, and other macros that show you other representations of the code during the compilation process.

Wrapping Up

It turns out that all the language features we have talked about were part of the plan for Julia from the start. As stated on Julia's website, this is the language's vision:

“We want a language that’s open source, with a liberal license. We want the speed of C with the dynamism of Ruby. We want a language that’s homoiconic, with true macros like Lisp, but with obvious, familiar mathematical notation like Matlab. We want something as usable for general programming as Python, as easy for statistics as R, as natural for string processing as Perl, as powerful for linear algebra as Matlab, as good at gluing programs together as the shell. Something that is dirt simple to learn, yet keeps the most serious hackers happy. We want it interactive, and we want it compiled.”

Having read this story, you should more or less be able to reflect on every phrase mentioned in the vision statement.

I hope that reading this has helped you learn more about the Julia language and that you will consider learning the language. Till next time, au revoir.

Embracing Julia: An Invitation Letter was originally published in Towards Data Science on Medium, where people are continuing the conversation by highlighting and responding to this story.
