Geocoding with the Yandex Maps API for data visualization in DataLens

Written by
Yandex Cloud
Updated on March 7, 2025
  • Getting started
    • Required paid resources
  • Get the Geocoder API key
  • Convert your data in DataSphere
    • Create a project
    • Create a secret
    • Create a notebook
    • Install dependencies
    • Install certificates
    • Upload and convert your data
  • Create a connection to the file in DataLens
  • Create a dataset based on the connection
  • Create a chart
  • How to delete the resources you created

In this tutorial, you will learn how to convert addresses to geo-coordinates using the Geocoder API and visualize data in DataLens. Data is processed using Python scripts in Jupyter Notebooks in Yandex DataSphere.

We will use data from a ClickHouse® demo database as the data source.

  1. Get your cloud ready.
  2. Get the Geocoder API key.
  3. Convert your data in DataSphere.
  4. Create a connection to the file in DataLens.
  5. Create a dataset based on the connection.
  6. Create a chart.

If you no longer need the resources you created, delete them.

Getting started

Before getting started, register in Yandex Cloud, set up a community, and link your billing account to it.

  1. On the DataSphere home page, click Try for free and select an account to log in with: a Yandex ID or a work account with identity federation (SSO).
  2. Select the organization you are going to use in Yandex Cloud.
  3. Create a community.
  4. Link your billing account to the DataSphere community you are going to work in. Make sure you have a linked billing account and its status is ACTIVE or TRIAL_ACTIVE. If you do not have a billing account yet, create one in the DataSphere interface.

Tip

To make sure Yandex DataLens and Yandex DataSphere can run within the Yandex Cloud network, create their instances in the same organization.

Required paid resources

The infrastructure deployment cost includes a fee for using DataSphere computing resources.

Get the Geocoder API key

Get a key required to use the Geocoder API:

  1. Go to the Developer dashboard and click Connect APIs.

    image

  2. In the window that opens, select JavaScript API and Geocoder HTTP API and click Continue.

  3. Fill out the form and click Continue.

  4. In the window that opens, click Go to API.

  5. Under API keys, copy the value of the key.

    image
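Once you have the key, the Geocoder HTTP API is queried with a plain GET request. The sketch below (using the placeholder YOUR_API_KEY and a sample address, both assumptions for illustration) shows how such a request URL is composed:

```python
from urllib.parse import urlencode

# Shape of a Geocoder HTTP API request; YOUR_API_KEY is a placeholder
# for the key copied in the previous step.
base_url = 'https://geocode-maps.yandex.ru/1.x'
params = {
    'apikey': 'YOUR_API_KEY',
    'geocode': 'Moscow, Lva Tolstogo St, 16',  # address to resolve
    'format': 'json',                          # request a JSON response
}
request_url = f'{base_url}?{urlencode(params)}'
print(request_url)
```

The same parameters are passed programmatically via `requests.get(..., params=...)` in the DataSphere notebook later in this tutorial.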

Convert your data in DataSphere

Create a project

  1. Open the DataSphere home page.
  2. In the left-hand panel, select Communities.
  3. Select the community to create a project in.
  4. On the community page, click Create project.
  5. In the window that opens, enter a name and description (optional) for the project.
  6. Click Create.

Create a secret

Create a secret to store the Geocoder API key:

  1. Under Project resources on the project page, click Secret.
  2. Click Create.
  3. In the Name field, enter the name for the secret: API_KEY.
  4. In the Value field, enter the key value.
  5. Click Create. You will see a page with detailed info on the secret you created.

Create a notebook

  1. Select the relevant project in your community or on the DataSphere homepage in the Recent projects tab.

  2. Click Open project in JupyterLab and wait for the loading to complete.

  3. In the top panel of the project window, click File and select New → Notebook.

  4. Select DataSphere Kernel and click Select.

    image

Install dependencies

  1. Paste the code below into the notebook cell and run it:

    %pip install requests
    %pip install clickhouse-driver
    
  2. Restart the kernel by clicking Kernel → Restart Kernel in the top panel of the project window.

Install certificates

Install certificates into the project's local storage:

#!:bash
mkdir --parents /home/jupyter/datasphere/project/Yandex/

wget "https://storage.yandexcloud.net/cloud-certs/RootCA.pem" \
     --output-document /home/jupyter/datasphere/project/Yandex/RootCA.crt

wget "https://storage.yandexcloud.net/cloud-certs/IntermediateCA.pem" \
     --output-document /home/jupyter/datasphere/project/Yandex/IntermediateCA.crt

Upload and convert your data

  1. Create a class to work with the Geocoder API:

    import requests
    from dataclasses import dataclass
    
    @dataclass
    class YandexGeocoder:
        api_key: str
        geocoder_url: str = 'https://geocode-maps.yandex.ru/1.x'
    
        def adress_to_geopoint(self, address: str) -> str:
    
            # Converting an address to geo-coordinates in DataLens format
    
            response = requests.get(self.geocoder_url, params={
                'apikey': self.api_key,
                'geocode': address,
                'format': 'json',
            })
            response.raise_for_status()
    
            result = response.json()['response']['GeoObjectCollection']['featureMember']
            if not result:
                return None
    
            lat, lon = result[0]['GeoObject']['Point']['pos'].split(' ')
            return self._to_datalens_format(lon, lat)
    
        def _to_datalens_format(self, lon, lat):
            return f'[{lon},{lat}]'
    
  2. Connect to the ClickHouse® demo DB:

    from clickhouse_driver import Client
    
    ch_client = Client(
        host='rc1a-ckg8nrosr2lim5iz.mdb.yandexcloud.net',
        user='samples_ro',
        password='MsgfcjEhJk',
        database='samples',
        port=9440,
        secure=True,
        verify=True,
        ca_certs='/home/jupyter/datasphere/project/Yandex/RootCA.crt'
    )
    
  3. Run a check using this command:

    print(ch_client.execute('SELECT version()'))
    

    If the connection is successful, the cell output will display the ClickHouse® version number.

  4. Export data from the table with shop addresses into the ch_data variable:

    ch_data = ch_client.execute('SELECT ShopName, ShopAddress FROM MS_Shops')
    ch_data
    
  5. Convert the addresses from the ShopAddress column into geo-coordinates:

    import os
    
    geocoder = YandexGeocoder(api_key=os.environ['API_KEY'])
    
    encoded_data = [
        (name, geocoder.adress_to_geopoint(adress))
        for name, adress in ch_data
    ]
    encoded_data
    
  6. Save the resulting data to a file:

    import csv
    
    filename = 'encoded_data.csv'
    
    with open(filename, 'w') as f:
        csv_writer = csv.writer(
            f,
            delimiter=',',
            quotechar='"',
            quoting=csv.QUOTE_MINIMAL,
        )
        csv_writer.writerows(encoded_data)
    

    You will see the encoded_data.csv file in the left-hand panel.

    image

  7. Download the file: right-click it and select Download.
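The coordinate handling above hinges on one detail: the Geocoder API returns the `pos` value as a longitude-latitude pair separated by a space, while DataLens expects geopoints in `[latitude,longitude]` form. A minimal standalone sketch of that conversion (the `pos_to_datalens` helper is hypothetical, mirroring what the class method above does):

```python
def pos_to_datalens(pos: str) -> str:
    # The Geocoder API returns 'pos' as "longitude latitude",
    # while DataLens expects geopoints as "[latitude,longitude]".
    lon, lat = pos.split(' ')
    return f'[{lat},{lon}]'

# Sample 'pos' value near the Moscow city center
print(pos_to_datalens('37.617635 55.755814'))  # → [55.755814,37.617635]
```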

Create a connection to the file in DataLens

  1. Go to the DataLens home page.

  2. In the left-hand panel, select Connections and click Create connection.

  3. Under Files and services, select the Files connection.

  4. Click Upload files and select the encoded_data.csv file.

    image

  5. In the top-right corner, click Create connection.

  6. Enter geocoder_csv for the connection name and click Create.

Create a dataset based on the connection

  1. In the top-right corner, click Create dataset.

  2. Go to the Fields tab.

  3. Rename the fields as follows:

    • field1 to Shop name
    • field2 to Coordinates
  4. For the Coordinates field, change the data type to Geopoint.

    image

  5. In the top-right corner, click Save.

  6. Enter geocoder_data for the dataset name and click Create.

Create a chart

  1. In the top-right corner, click Create chart.

  2. Select the Map visualization type.

  3. Drag the Coordinates field to the Points (Geopoints) section.

  4. Drag the Shop name field to the Tooltips section.

    image

  5. In the top-right corner, click Save.

  6. Enter the chart name and click Save.

How to delete the resources you created

If you no longer plan to use the DataSphere project, delete it.

ClickHouse® is a registered trademark of ClickHouse, Inc.

© 2025 Direct Cursus Technology L.L.C.