On the Overview screen, having the option to select multiple registers to assign or delete would improve the user experience by reducing the time needed when multiple Checklists must be changed.
I am busy with Cognite Training: Data Engineer Basics > Python SDK Transformations (Hands On). I have completed '1. Environment setup' without any problems, and at the end of that step I have a variable defined. However, the next step is to create the database in CDF, and this gives what looks like an access error: CogniteAPIError: Unauthorized | code: 401. Could someone please advise me what to do to correct this? Thank you. Doug
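For anyone hitting the same thing: a 401 at this step usually means the client's credentials are wrong or missing, or the principal lacks the rawAcl WRITE capability. Below is a minimal, hedged sketch (every ID, URL, and name is a placeholder, not the course's actual values) of authenticating the client and creating a RAW database with the Python SDK:

```python
# Minimal sketch: authenticate with OAuth client credentials, then create a RAW
# database. All IDs, URLs, and names below are placeholders.
from cognite.client import CogniteClient, ClientConfig
from cognite.client.credentials import OAuthClientCredentials

creds = OAuthClientCredentials(
    token_url="https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",  # placeholder
    client_id="<client-id>",          # placeholder
    client_secret="<client-secret>",  # placeholder
    scopes=["https://<cluster>.cognitedata.com/.default"],  # placeholder cluster
)
client = CogniteClient(
    ClientConfig(
        client_name="training-script",
        project="<project>",  # placeholder
        base_url="https://<cluster>.cognitedata.com",  # placeholder
        credentials=creds,
    )
)

# Requires RAW write access on the project; a 401 here means authentication failed,
# a 403 would mean the token is valid but lacks the capability.
client.raw.databases.create("my_training_db")  # hypothetical database name
```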
What access capabilities do I need to run transformations as “Current User”? I have a user who doesn't see “Run as current user”, as in the first screenshot. The next screenshot is mine, and I can see the option, probably because I am added as an admin.
CDF - Filter option is not working as expected under Common filters on the Data Explorer screen.
Steps to reproduce:
1. Log in to CDF.
2. Click the Data Explorer tab in the CDF menu bar.
3. Click the Files tab on the right side of the panel.
4. Set Data set to 'src:006:documentum:b60:ds' under Common filters on the left side of the screen.
5. Select the 'Before' checkbox under Common filters on the left side of the panel.
6. Click the calendar icon and set the date to, e.g., '10-01-2023'.
Expected result: the document 'Amarjeet_Test_DT.docx' should not appear in the results window, because it was created after the set date.
Actual result: the document 'Amarjeet_Test_DT.docx' is displayed in CDF.
Note: the issue exists for all date filters (Created time, Updated time) with Before, After, and During in CDF. The user wants to know which date is used when filtering documents in CDF with these filters.
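One hedged way to investigate which timestamp the filter actually uses is to query the same files via the Python SDK and compare the candidate date fields side by side (the cutoff date below is illustrative):

```python
# Sketch: list files whose server-side created_time is before a cutoff (ms since
# epoch), then print the different timestamp fields to see which one the UI filter
# appears to match.
from datetime import datetime, timezone

from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are already configured for the project

cutoff_ms = int(datetime(2023, 10, 1, tzinfo=timezone.utc).timestamp() * 1000)
files = client.files.list(created_time={"max": cutoff_ms}, limit=10)
for f in files:
    # created_time/uploaded_time are set by CDF; source_created_time comes from the source system
    print(f.name, f.created_time, f.uploaded_time, f.source_created_time)
```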
This discussion is linked to the course Cognite Data Fusion for Domain Experts. A data-driven approach to maintenance is more efficient than relying on a fixed schedule or fixing something once it's already broken. There are plenty of opportunities around us to take advantage of smart maintenance. Look around in your home or your office, or think of your work tasks. Where could data-driven maintenance make a difference? Share your thoughts! For example, explain what data you have available or would need in order to do smart maintenance, and how this would be an improvement over current methods.
Hi Everyone! I’m excited to be here to learn more about Cognite’s solutions for industry.
Hello, when I try to run the DB Extractor, I get the following error: “polars\_cpu_check.py:232: RuntimeWarning: Missing required CPU features. The following required CPU features were not detected: avx, avx2, fma. Continuing to use this version of Polars on this processor will likely result in a crash. Install the `polars-lts-cpu` package instead of `polars` to run Polars with better compatibility. Hint: If you are on an Apple ARM machine (e.g. M1) this is likely due to running Python under Rosetta. It is recommended to install a native version of Python that does not run under Rosetta x86-64 emulation. If you believe this warning to be a false positive, you can set the `POLARS_SKIP_CPU_CHECK` environment variable to bypass this check.” After doing some googling, I was able to install the referenced polars-lts-cpu package using Python, but I got the same error. I'm not sure how to make the extractor reference the polars-lts-cpu package when it runs. See attached screenshot. The extractor i
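If the extractor is launched from a Python environment you control, the warning's own escape hatch can be tried as a workaround sketch (untested here; the cleaner fix is installing polars-lts-cpu into the exact environment the extractor imports from, not the system Python):

```python
# Workaround sketch, per the warning text itself: the environment variable must be
# set before polars is first imported, e.g. at the top of a wrapper script that
# launches the extractor. Whether polars then runs stably on this CPU is untested.
import os

os.environ["POLARS_SKIP_CPU_CHECK"] = "1"  # bypasses the startup CPU-feature check
```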
I am facing this error in the data science course on “creating Cognite Functions” with the Cognite SDK. In previous courses I fixed this error by replacing the “datapoints” keyword with “time_series”. However, I would like to know whether I am perhaps not using the right packages, or whether the commands are deprecated and have new function names in newer versions. Kindly let me know - I am trying to finish these courses before my local boot camp next week. Thanks, Lavanya
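For context, this matches a real rename in the Python SDK: datapoint calls moved from client.datapoints to client.time_series.data around SDK v5. A hedged sketch of the old and new call styles (the external ID is a placeholder):

```python
# Sketch of the API rename between cognite-sdk major versions.
from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are already configured

# Old style (pre-v5 SDKs), now removed:
# dps = client.datapoints.retrieve(external_id="my_ts", start="7d-ago", end="now")

# Current style:
dps = client.time_series.data.retrieve(external_id="my_ts", start="7d-ago", end="now")
print(len(dps))
```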
Data Workflow UI stopped rendering the workflow with the following error message. It stopped working as soon as I created a CDF function to trigger the workflow. https://delfi-us.fusion.cognite.com/shaya-dev/flows/SDM-ARP-Model-Refresh?cluster=az-eastus-1.cognitedata.com&env=az-eastus-1
Deployment of workflows: can you guide us on how to automate the deployment of Cognite workflows using the SDK?
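A minimal sketch of one way to do this with the Python SDK's workflows API (all external IDs and the referenced function are hypothetical, and the method for triggering an execution has shifted between SDK versions, so check your installed version):

```python
# Sketch: upsert a workflow, upsert a version with a single function task, then run it.
from cognite.client import CogniteClient
from cognite.client.data_classes import (
    FunctionTaskParameters,
    WorkflowDefinitionUpsert,
    WorkflowTask,
    WorkflowUpsert,
    WorkflowVersionUpsert,
)

client = CogniteClient()  # assumes credentials are already configured

# 1. Create or update the workflow itself (idempotent).
client.workflows.upsert(WorkflowUpsert(external_id="my-workflow", description="Demo"))

# 2. Create or update a version with its task graph (here: one function task).
client.workflows.versions.upsert(
    WorkflowVersionUpsert(
        workflow_external_id="my-workflow",
        version="1",
        workflow_definition=WorkflowDefinitionUpsert(
            description="Single function task",
            tasks=[
                WorkflowTask(
                    external_id="run-my-function",
                    parameters=FunctionTaskParameters(external_id="my-cognite-function"),
                )
            ],
        ),
    )
)

# 3. Trigger a run of the deployed version.
client.workflows.executions.run("my-workflow", "1")
```

Running this from CI makes workflow deployment repeatable, since both upsert calls can be re-applied safely on every pipeline run.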
Is the following use case a good fit for Data Workflows?
- A Cognite Function that reads data from Time Series and writes "event-like" data into Data Modeling.
- Continuous processing for metric type A: ideally an execution every minute, but more importantly no overlapping executions (i.e., if an execution takes longer than a minute), passing state from one function execution to the next.
- Hourly processing for metric type B.
Idea: two Data Workflows:
1. Hourly execution of a workflow that triggers a Cognite Function to calculate metric B.
2. Daily execution of a workflow that triggers a Cognite Function that recursively outputs a dynamic task, which calculates metric A. The Function outputs information for the next execution run (i.e., timestamp for the next execution, state, and other info). At the end of the day the dynamic task would not output a timestamp for the next execution, and the workflow would complete. The execution trigger could also be daily. (A sketch of such a handler follows below.)
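For the recursive metric-A function, the handler could return the next timestamp and state roughly like this sketch (all field names and the end-of-day check are invented for illustration, not a Data Workflows contract):

```python
# Illustrative handler for the "recursive" metric-A function described above: process
# one interval, then return state plus the timestamp for the next run, or no timestamp
# so the dynamic-task chain stops at end of day.
from datetime import datetime, timedelta

def handle(client, data):
    # 'data' carries state from the previous run; the keys are made up for this sketch
    interval_start = datetime.fromisoformat(data["next_run"])
    interval_end = interval_start + timedelta(minutes=1)

    # ... read time series for [interval_start, interval_end), compute metric A,
    # and write the "event-like" result into Data Modeling ...

    end_of_day = interval_start.replace(hour=23, minute=59, second=0, microsecond=0)
    if interval_end >= end_of_day:
        return {"done": True}  # no next timestamp: the chain completes for today
    return {"next_run": interval_end.isoformat(), "state": data.get("state", {})}
```

Since each run only starts after the previous one returns its output, this pattern also gives the non-overlapping-execution guarantee mentioned above.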
Hi Team, is there any possibility that I can register for the Bootcamp virtually, or that it can be held in an India location, as I am from India? Also, I want to know whether the Bootcamp is for individuals or groups, and what the cost of attending is. Thanks, Navyasri Indupalli
Hey everyone! My name is Sachin. I am a data professional from the data & programming world. I'm excited to be part of this community, and I look forward to finding and giving help with everything related to data & analytics. I'm here because I am excited to learn more about Cognite's data journey, products & services. I've been a data architect & data engineer in the financial technology industry for close to 12 years now, and I look forward to exploring a new industry to enhance my knowledge.
This discussion is linked to the course Industrial Canvas. Cognite's Industrial Canvas provides a digital whiteboard where contextualized data from various sources can be gathered in a single space. Data such as P&IDs, time series, assets, and events can be imported from a CDF project into the canvas. Having all relevant data in a single digital workspace removes the dependency on manual workflows and opens the door to efficient collaboration between engineers! Think about your everyday workflow: what tasks do you think could be done more efficiently using Industrial Canvas? Do you often need to gather troubleshooting data into one canvas for easier analysis? Maybe you often need to view time series together with a remote colleague? Share your thoughts with us and get inspired by people from different domains explaining how they would use Industrial Canvas in their everyday work! Omar | Cognite Academy
Currently, the data model and data model instance actions are limited to read and write capabilities. To enhance clarity and delineate responsibilities within CDF groups more effectively, I propose dividing the existing configuration into distinct categories: read, write/update, and delete. Another enhancement would be to disallow the deletion of containers that have associated views; the same principle applies to views, which should not be deletable while other components (views or data models) reference them.
After we released the first version of time series data quality monitoring, we've seen an unprecedented demand for ensuring that time series data is continuously reliable. It's important for end users of apps and dashboards to know when they can -- and importantly when they cannot -- rely on data to make operational decisions. That means that you, as a data scientist or application developer, need to communicate the data quality status to the end user.

You can now easily report live data quality status in dashboards and applications using our data quality monitoring service. The monitoring service creates live data quality metrics, available as time series in Cognite Data Fusion, that you can display in your application -- ensuring that end users know the quality of the data they are using. Try it out! Learn more in our guide to report data quality status in apps and dashboards.

To enable this in your project, the service account used by the data quality monitoring service needs
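Since the metrics are exposed as ordinary CDF time series, surfacing the status in an app can be as simple as this hedged sketch (the external ID is a placeholder for whatever metric series the monitoring service creates in your project):

```python
# Sketch: read the latest value of a data quality metric time series so an app or
# dashboard can show the current status to end users.
from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are already configured

latest = client.time_series.data.retrieve_latest(external_id="dq_metric_placeholder")
if latest and latest.value:
    print("Latest data quality value:", latest.value[0], "at", latest.timestamp[0])
```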
Hey team, I was wondering if there is going to be an alignment on what we call this tool across the many places it is referenced? I've found these 5 so far:
- This hub group: CDF Toolkit, cdf-tk and project templates
- GitHub: cdf-project-templates
- PyPI: cognite-toolkit
- Command line: cdf-tk
I'm personally a fan of cdf-tk. cognite-toolkit is too broad (doesn't include InRobot, Maintain, f25e). cdf-project-templates is too many words for a CLI. project-templates is too broad.
Hey Data Workflows users!

TL;DR: If you're using the workflow execution cancellation endpoint, with the Python SDK or without, you will have to update your workflow execution cancellation calls before 29/05/2024. More information below.

In our push towards making the Data Workflows API generally available in the June release, we aim to ensure a consistent and expected experience across the API. For this reason, we've decided to make a breaking change to the workflow execution cancellation endpoint.

Endpoint changes
In summary, the cancellation endpoint now only allows the cancellation of a single execution at a time, instead of allowing multiple executions to be cancelled in one call. The updated API documentation can be found here. The Python SDK, starting from version 7.42.0, will now point to the new endpoint. An example call to the new endpoint using the Python SDK can be found here.

Timeline
The new endpoint is already available for use. To assist with the transition, the old endpoint wi
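A hedged example of the single-execution cancellation call with the Python SDK (the execution ID and reason are placeholders; check the SDK reference for your installed version, as described above):

```python
# Sketch: cancel exactly one workflow execution per call, per the updated endpoint.
from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are already configured

execution = client.workflows.executions.cancel(
    id="<execution-id>",                 # placeholder: one execution per request
    reason="Superseded by a newer run",  # illustrative reason string
)
print(execution.status)
```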
Currently, in the PI extractor configuration file, you can specify the end timestamp for the backfilling of a time series by using the “to” parameter in the backfill section. What we would expect is to end up relatively close to this set date. However, the docs say that backfilling can overshoot by the number of datapoints specified in the filling chunk size, which results in over-ingestion of datapoints. We would like a way to control backfilling more precisely: we currently cannot limit the overshoot without also affecting front-filling, since both depend on the same datapoint chunk size parameter (cdf-chunking > data-points).
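For reference, a hedged sketch of the configuration keys involved (key names as described above; the values and exact nesting are illustrative, so check the extractor docs for your version):

```yaml
# Illustrative only: the backfill "to" bound, and the shared chunk size that lets
# backfilling overshoot that bound by up to one chunk of datapoints.
backfill:
  to: "2020-01-01T00:00:00Z"   # desired backfill end; may be overshot per the docs

cdf-chunking:
  data-points: 10000           # shared by backfill and front-fill, hence the coupling
```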
After how much time does a retry of a function get triggered when it fails? Is there a way to set the time after which a function is retriggered following a failure? This would help us deal with any rate limits in functions.
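As far as we know the platform-level retry cadence isn't user-configurable, so a common workaround for rate limits is to back off inside the function handler itself; a minimal sketch (the retried call is a placeholder supplied by the caller):

```python
# Sketch: exponential backoff around a rate-limited call inside a function handler,
# so transient 429s are retried locally instead of failing the whole function run.
import time

def call_with_backoff(fn, max_attempts=5, base_delay_s=1.0):
    """Retry fn() with exponential backoff: waits 1s, 2s, 4s, ... between attempts."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the original error
            time.sleep(base_delay_s * (2 ** attempt))
```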
As beneficial as inspection robots are, they may suffer from limitations if they are not trained in an environment simulating the industrial environment they will be deployed in. Therefore, Cognite joined forces with Aker Solutions, TESS, and Createc to build an innovative testing and training facility for inspection robots, called Robot Garden, which will help take robotics' impact on industrial safety and efficiency to new heights. Robot Garden is located at Fornebu, Oslo, and acts as a local test and training facility where robots train in a realistic environment, similar to the one where they will be deployed, thus enhancing their mission efficiency and helping robot users get the most out of their robot deployments. The main driver behind Robot Garden is to test the robustness of the trained AI models in challenging deployment scenarios such as bad weather, bad lighting, uneven terrain, etc. In addition, TESS provides a simulation panel that can simulate vari
Hi, there are some bugs when doing contextualization in the Fusion GUI. It should be possible to “select all” when I do a search query in the interactive engineering diagrams contextualization workflow.
We would like to have a “contains” filter option on columns. This would be useful for searching string-type columns to check whether they contain some substring, to filter the data properly.
In multiple cases, the validation of a Checklist is not needed, and those checklists will remain pending, depending on the Team Captain's availability. This can also require time from the Team Captain to approve all the pending Checklists. The suggestion is to add an option on the Template creation screen so the user can define whether approval is necessary for the checklists generated from that Template, along with the possibility to define the group of users who will be able to approve them. As a complement, new statuses for Checklists could also be created: one to indicate that a Checklist is finished but needs approval, and one to signal that it has been approved.
Hello community, my name is Dan Riley, Analytics Manager with Interstates, Inc. Looking forward to learning more about CDF!