aito.api.upload_entries

aito.api.upload_entries(client: aito.client.aito_client.AitoClient, table_name: str, entries: Iterable[Dict], batch_size: int = 1000, optimize_on_finished: bool = True)

Populate the table with entries, uploading in batches of batch_size

Note

requires the client to be set up with the READ-WRITE API key

Parameters
  • client (AitoClient) – the AitoClient instance

  • table_name (str) – the name of the table

  • entries (Iterable[Dict]) – iterable of the table entries

  • batch_size (int, optional) – the batch size, defaults to 1000

  • optimize_on_finished (bool, optional) – optimize the table when the upload has finished, defaults to True
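The batch_size parameter controls how many entries are sent per request. A minimal sketch of the chunking behaviour — the batched helper below is hypothetical, written for illustration only, and is not part of aito.api:

```python
from itertools import islice
from typing import Dict, Iterable, Iterator, List


def batched(entries: Iterable[Dict], batch_size: int) -> Iterator[List[Dict]]:
    # Hypothetical helper: yield successive lists of at most batch_size
    # entries. This only illustrates how an input iterable can be split
    # into request-sized batches; the SDK's internals may differ.
    it = iter(entries)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch


# A generator is consumed lazily, one batch at a time, so the full
# entry set never needs to fit in memory at once.
chunks = list(batched(({'id': i} for i in range(2500)), batch_size=1000))
# 2500 entries with batch_size=1000 split into batches of 1000, 1000, 500
```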

Upload a Pandas DataFrame

>>> import pandas as pd
>>> df = pd.DataFrame({'height': [94, 170], 'weight': [31, 115], 'depth': [3, 29]})
>>> entries = df.to_dict(orient='records')
>>> upload_entries(client=client, table_name='specifications', entries=entries)

Upload a generator of entries

>>> def entries_generator(start, end):
...     for idx in range(start, end):
...         entry = {'id': idx}
...         yield entry
>>> upload_entries(
...     client=client,
...     table_name="table_name",
...     entries=entries_generator(start=0, end=10000),
...     batch_size=500,
...     optimize_on_finished=False
... )