Metadata-Version: 2.1
Name: CosmoTech-Acceleration-Library
Version: 0.4.4
Summary: Acceleration libraries for CosmoTech cloud based solution development
Author-email: Cosmo Tech <platform@cosmotech.com>
Project-URL: Homepage, https://www.cosmotech.com
Project-URL: Source, https://github.com/Cosmo-Tech/CosmoTech-Acceleration-Library
Project-URL: Documentation, https://cosmo-tech.github.io/CosmoTech-Acceleration-Library
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: azure-functions==1.17.0
Requires-Dist: azure-digitaltwins-core==1.2.0
Requires-Dist: azure-identity==1.15.0
Requires-Dist: azure-kusto-data==4.2.0
Requires-Dist: azure-kusto-ingest==4.2.0
Requires-Dist: redis==4.4.4
Requires-Dist: redisgraph_bulk_loader==0.10.2
Requires-Dist: cosmotech-api
Requires-Dist: openpyxl==3.1.2
Requires-Dist: pandas==2.1.2
Requires-Dist: python-dateutil==2.8.2
Provides-Extra: doc
Requires-Dist: mkdocs==1.5.3; extra == "doc"
Requires-Dist: mkdocs-gen-files==0.5.0; extra == "doc"
Requires-Dist: mkdocs-material==9.4.8; extra == "doc"
Requires-Dist: mkdocstrings[python]==0.23.0; extra == "doc"
Requires-Dist: mkdocs-literate-nav==0.6.1; extra == "doc"
Requires-Dist: pymdown-extensions==10.3.1; extra == "doc"
Requires-Dist: requirements-parser==0.5.0; extra == "doc"
Requires-Dist: mike==2.0.0; extra == "doc"

# CosmoTech-Acceleration-Library
Acceleration library for CosmoTech cloud-based solution development

## Code organisation

In the project root directory you'll find four main directories:

* CosmoTech_Acceleration_Library: all Cosmo Tech libraries that accelerate interaction with Cosmo Tech solutions
* data: the CSV files on which the samples are based
* samples: Python scripts demonstrating how to use the library
* doc: schemas and specific documentation

## Accelerators

TODO

## Modelops library

The aim of this library is to simplify model access via Python code.

The library can be used by data scientists, modelers, developers, ...

### Utility classes

* `ModelImporter(host: str, port: int, name: str, version: int, graph_rotation: int = 1)`: lets you bulk-import data from CSV files, with a schema enforced (`samples/Modelops/Bulk_Import_from_CSV_with_schema.py`) or not (`samples/Modelops/Bulk_Import_from_CSV_without_schema.py`) (see the [documentation](https://github.com/RedisGraph/redisgraph-bulk-loader#input-schemas) for further details)
* `ModelExporter(host: str, port: int, name: str, version: int, export_dir: str = '/')`: lets you export data from a model cache instance
* `ModelReader(host: str, port: int, name: str, version: int)`: lets you read data from a model cache instance ([object returned](https://github.com/RedisGraph/redisgraph-py/blob/master/redisgraph/query_result.py))
* `ModelWriter(host: str, port: int, name: str, version: int, graph_rotation: int = 1)`: lets you write data into a model instance
* `ModelUtil`: a set of utilities to manipulate and facilitate interaction with a model instance (`result_set_to_json`, `print_query_result`, ...)
* `ModelMetadata`: lets you manage graph metadata
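To illustrate the kind of helper `ModelUtil` provides, here is a minimal, hypothetical sketch of a `result_set_to_json`-style conversion using only the standard library. The function name and the header/rows input shape are assumptions for illustration; the real `ModelUtil` operates on RedisGraph query results and its signature may differ.

```python
import json

def result_set_to_json(header, rows):
    """Convert a tabular query result (column names + rows) into a JSON
    array of objects, one object per row -- roughly the shape a
    result_set_to_json helper would produce from a graph query result."""
    return json.dumps([dict(zip(header, row)) for row in rows])

# A tiny result set, as a graph query might return it
header = ["id", "name"]
rows = [[1, "pump_a"], [2, "pump_b"]]
print(result_set_to_json(header, rows))
# -> [{"id": 1, "name": "pump_a"}, {"id": 2, "name": "pump_b"}]
```

Serializing per-row dictionaries keeps the output self-describing, which is convenient when passing query results to downstream tools that expect JSON records.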

## How-to

To install the library from the project root:

`python setup.py install --user`

Note: recent setuptools versions deprecate `setup.py install`; `pip install --user .` from the same directory is the modern equivalent.


