I use Jupyter almost every day and, like many others, when I first started learning Spark I developed my first data analysis pipelines using interactive notebooks and the Python API. They are really useful when I want to present some code, let someone reproduce my research, or just learn how to use new tools and libraries.

Then I realized that I wanted more, and running notebooks locally was not enough for me, so in 2015 I signed up for a Databricks Community Edition subscription. Databricks let me forget about the problems related to setting up and maintaining the environment.

Everyone who is learning and using Spark eventually realizes that the Python API is not as powerful and flexible as the core language of the framework - Scala. Scala lets you start feeling the full power of Spark, comprising its analytics, streaming, and graph processing tools. However, Spark is still just another framework for large-scale data analytics. Yes, it is convenient and powerful, but it ships with a limited number of algorithms, and sometimes you need to implement your own custom algorithm.