Lookoutequipment

This page documents functions available when using the Lookoutequipment module, created with @service Lookoutequipment.

Documentation

Main.Lookoutequipment.create_datasetMethod
create_dataset(client_token, dataset_name)
create_dataset(client_token, dataset_name, params::Dict{String,<:Any})

Creates a container for a collection of data being ingested for analysis. The dataset contains the metadata describing where the data is and what the data actually looks like. In other words, it contains the location of the data source, the data schema, and other information. A dataset also contains any tags associated with the ingested data.

Arguments

  • client_token: A unique identifier for the request. If you do not set the client request token, Amazon Lookout for Equipment generates one.
  • dataset_name: The name of the dataset being created.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "DatasetSchema": A JSON description of the data that is in each time series dataset, including names, column names, and data types.
  • "ServerSideKmsKeyId": Provides the identifier of the KMS key used to encrypt dataset data by Amazon Lookout for Equipment.
  • "Tags": Any tags associated with the ingested data described in the dataset.
source
Main.Lookoutequipment.create_inference_schedulerMethod
create_inference_scheduler(client_token, data_input_configuration, data_output_configuration, data_upload_frequency, inference_scheduler_name, model_name, role_arn)
create_inference_scheduler(client_token, data_input_configuration, data_output_configuration, data_upload_frequency, inference_scheduler_name, model_name, role_arn, params::Dict{String,<:Any})

Creates a scheduled inference. Scheduling an inference is setting up a continuous real-time inference plan to analyze new measurement data. When setting up the schedule, you provide an S3 bucket location for the input data, assign it a delimiter between separate entries in the data, set an offset delay if desired, and set the frequency of inferencing. You must also provide an S3 bucket location for the output data.

Arguments

  • client_token: A unique identifier for the request. If you do not set the client request token, Amazon Lookout for Equipment generates one.
  • data_input_configuration: Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.
  • data_output_configuration: Specifies configuration information for the output results for the inference scheduler, including the S3 location for the output.
  • data_upload_frequency: How often data is uploaded to the source Amazon S3 bucket for the input data. The value chosen is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment runs inference on your data. For more information, see Understanding the inference process.
  • inference_scheduler_name: The name of the inference scheduler being created.
  • model_name: The name of the previously trained ML model being used to create the inference scheduler.
  • role_arn: The Amazon Resource Name (ARN) of a role with permission to access the data source being used for the inference.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "DataDelayOffsetInMinutes": The interval (in minutes) of planned delay at the start of each inference segment. For example, if inference is set to run every ten minutes, the delay is set to five minutes and the time is 09:08. The inference scheduler will wake up at the configured interval (which, without a delay configured, would be 09:10) plus the additional five minute delay time (so 09:15) to check your Amazon S3 bucket. The delay provides a buffer for you to upload data at the same frequency, so that you don't have to stop and restart the scheduler when uploading new data. For more information, see Understanding the inference process.
  • "ServerSideKmsKeyId": Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for Equipment.
  • "Tags": Any tags associated with the inference scheduler.
source
Main.Lookoutequipment.create_labelMethod
create_label(client_token, end_time, label_group_name, rating, start_time)
create_label(client_token, end_time, label_group_name, rating, start_time, params::Dict{String,<:Any})

Creates a label for an event.

Arguments

  • client_token: A unique identifier for the request to create a label. If you do not set the client request token, Lookout for Equipment generates one.
  • end_time: The end time of the labeled event.
  • label_group_name: The name of a group of labels. Data in this field will be retained for service usage. Follow best practices for the security of your data.
  • rating: Indicates whether a labeled event represents an anomaly.
  • start_time: The start time of the labeled event.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "Equipment": Indicates that a label pertains to a particular piece of equipment. Data in this field will be retained for service usage. Follow best practices for the security of your data.
  • "FaultCode": Provides additional information about the label. The fault code must be defined in the FaultCodes attribute of the label group. Data in this field will be retained for service usage. Follow best practices for the security of your data.
  • "Notes": Metadata providing additional information about the label. Data in this field will be retained for service usage. Follow best practices for the security of your data.
source
Main.Lookoutequipment.create_label_groupMethod
create_label_group(client_token, label_group_name)
create_label_group(client_token, label_group_name, params::Dict{String,<:Any})

Creates a group of labels.

Arguments

  • client_token: A unique identifier for the request to create a label group. If you do not set the client request token, Lookout for Equipment generates one.
  • label_group_name: Names a group of labels. Data in this field will be retained for service usage. Follow best practices for the security of your data.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "FaultCodes": The acceptable fault codes (indicating the type of anomaly associated with the label) that can be used with this label group. Data in this field will be retained for service usage. Follow best practices for the security of your data.
  • "Tags": Tags that provide metadata about the label group you are creating. Data in this field will be retained for service usage. Follow best practices for the security of your data.
source
Main.Lookoutequipment.create_modelMethod
create_model(client_token, dataset_name, model_name)
create_model(client_token, dataset_name, model_name, params::Dict{String,<:Any})

Creates an ML model for data inference. A machine-learning (ML) model is a mathematical model that finds patterns in your data. In Amazon Lookout for Equipment, the model learns the patterns of normal behavior and detects abnormal behavior that could be potential equipment failure (or maintenance events). The models are made by analyzing normal data and abnormalities in machine behavior that have already occurred. Your model is trained using a portion of the data from your dataset and uses that data to learn patterns of normal behavior and abnormal patterns that lead to equipment failure. Another portion of the data is used to evaluate the model's accuracy.

Arguments

  • client_token: A unique identifier for the request. If you do not set the client request token, Amazon Lookout for Equipment generates one.
  • dataset_name: The name of the dataset for the ML model being created.
  • model_name: The name for the ML model to be created.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "DataPreProcessingConfiguration": The configuration is the TargetSamplingRate, which is the sampling rate of the data after post processing by Amazon Lookout for Equipment. For example, if you provide data that has been collected at a 1 second level and you want the system to resample the data at a 1 minute rate before training, the TargetSamplingRate is 1 minute. When providing a value for the TargetSamplingRate, you must attach the prefix "PT" to the rate you want. The value for a 1 second rate is therefore PT1S, the value for a 15 minute rate is PT15M, and the value for a 1 hour rate is PT1H
  • "DatasetSchema": The data schema for the ML model being created.
  • "EvaluationDataEndTime": Indicates the time reference in the dataset that should be used to end the subset of evaluation data for the ML model.
  • "EvaluationDataStartTime": Indicates the time reference in the dataset that should be used to begin the subset of evaluation data for the ML model.
  • "LabelsInputConfiguration": The input configuration for the labels being used for the ML model that's being created.
  • "OffCondition": Indicates that the asset associated with this sensor has been shut off. As long as this condition is met, Lookout for Equipment will not use data from this asset for training, evaluation, or inference.
  • "RoleArn": The Amazon Resource Name (ARN) of a role with permission to access the data source being used to create the ML model.
  • "ServerSideKmsKeyId": Provides the identifier of the KMS key used to encrypt model data by Amazon Lookout for Equipment.
  • "Tags": Any tags associated with the ML model being created.
  • "TrainingDataEndTime": Indicates the time reference in the dataset that should be used to end the subset of training data for the ML model.
  • "TrainingDataStartTime": Indicates the time reference in the dataset that should be used to begin the subset of training data for the ML model.
source
Main.Lookoutequipment.delete_datasetMethod
delete_dataset(dataset_name)
delete_dataset(dataset_name, params::Dict{String,<:Any})

Deletes a dataset and associated artifacts. The operation will check to see if any inference scheduler or data ingestion job is currently using the dataset, and if there isn't, the dataset, its metadata, and any associated data stored in S3 will be deleted. This does not affect any models that used this dataset for training and evaluation, but does prevent it from being used in the future.

Arguments

  • dataset_name: The name of the dataset to be deleted.
source
Main.Lookoutequipment.delete_inference_schedulerMethod
delete_inference_scheduler(inference_scheduler_name)
delete_inference_scheduler(inference_scheduler_name, params::Dict{String,<:Any})

Deletes an inference scheduler that has been set up. Already processed output results are not affected.

Arguments

  • inference_scheduler_name: The name of the inference scheduler to be deleted.
source
Main.Lookoutequipment.delete_labelMethod
delete_label(label_group_name, label_id)
delete_label(label_group_name, label_id, params::Dict{String,<:Any})

Deletes a label.

Arguments

  • label_group_name: The name of the label group that contains the label that you want to delete. Data in this field will be retained for service usage. Follow best practices for the security of your data.
  • label_id: The ID of the label that you want to delete.
source
Main.Lookoutequipment.delete_label_groupMethod
delete_label_group(label_group_name)
delete_label_group(label_group_name, params::Dict{String,<:Any})

Deletes a group of labels.

Arguments

  • label_group_name: The name of the label group that you want to delete. Data in this field will be retained for service usage. Follow best practices for the security of your data.
source
Main.Lookoutequipment.delete_modelMethod
delete_model(model_name)
delete_model(model_name, params::Dict{String,<:Any})

Deletes an ML model currently available for Amazon Lookout for Equipment. This will prevent it from being used with an inference scheduler, even one that is already set up.

Arguments

  • model_name: The name of the ML model to be deleted.
source
Main.Lookoutequipment.describe_data_ingestion_jobMethod
describe_data_ingestion_job(job_id)
describe_data_ingestion_job(job_id, params::Dict{String,<:Any})

Provides information on a specific data ingestion job such as creation time, dataset ARN, and status.

Arguments

  • job_id: The job ID of the data ingestion job.
source
Main.Lookoutequipment.describe_datasetMethod
describe_dataset(dataset_name)
describe_dataset(dataset_name, params::Dict{String,<:Any})

Provides a JSON description of the data in each time series dataset, including names, column names, and data types.

Arguments

  • dataset_name: The name of the dataset to be described.
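
A quick sketch, assuming the @service-generated module and that responses parse to Dicts; the dataset name is illustrative and the printed keys follow the DescribeDataset response:

using AWS
@service Lookoutequipment

resp = Lookoutequipment.describe_dataset("my-pump-dataset")
println(resp["Status"])   # e.g. "ACTIVE" once ingestion has completed
println(resp["Schema"])   # JSON description of the ingested data
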
source
Main.Lookoutequipment.describe_inference_schedulerMethod
describe_inference_scheduler(inference_scheduler_name)
describe_inference_scheduler(inference_scheduler_name, params::Dict{String,<:Any})

Specifies information about the inference scheduler being used, including name, model, status, and associated metadata.

Arguments

  • inference_scheduler_name: The name of the inference scheduler being described.
source
Main.Lookoutequipment.describe_labelMethod
describe_label(label_group_name, label_id)
describe_label(label_group_name, label_id, params::Dict{String,<:Any})

Returns the name of the label.

Arguments

  • label_group_name: Returns the name of the group containing the label.
  • label_id: Returns the ID of the label.
source
Main.Lookoutequipment.describe_label_groupMethod
describe_label_group(label_group_name)
describe_label_group(label_group_name, params::Dict{String,<:Any})

Returns information about the label group.

Arguments

  • label_group_name: Returns the name of the label group.
source
Main.Lookoutequipment.describe_modelMethod
describe_model(model_name)
describe_model(model_name, params::Dict{String,<:Any})

Provides a JSON containing the overall information about a specific ML model, including model name and ARN, dataset, training and evaluation information, status, and so on.

Arguments

  • model_name: The name of the ML model to be described.
source
Main.Lookoutequipment.list_data_ingestion_jobsMethod
list_data_ingestion_jobs()
list_data_ingestion_jobs(params::Dict{String,<:Any})

Provides a list of all data ingestion jobs, including dataset name and ARN, S3 location of the input data, status, and so on.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "DatasetName": The name of the dataset being used for the data ingestion job.
  • "MaxResults": Specifies the maximum number of data ingestion jobs to list.
  • "NextToken": An opaque pagination token indicating where to continue the listing of data ingestion jobs.
  • "Status": Indicates the status of the data ingestion job.
source
Main.Lookoutequipment.list_datasetsMethod
list_datasets()
list_datasets(params::Dict{String,<:Any})

Lists all datasets currently available in your account, filtering on the dataset name.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "DatasetNameBeginsWith": The beginning of the name of the datasets to be listed.
  • "MaxResults": Specifies the maximum number of datasets to list.
  • "NextToken": An opaque pagination token indicating where to continue the listing of datasets.
source
Main.Lookoutequipment.list_inference_eventsMethod
list_inference_events(inference_scheduler_name, interval_end_time, interval_start_time)
list_inference_events(inference_scheduler_name, interval_end_time, interval_start_time, params::Dict{String,<:Any})

Lists all inference events that have been found for the specified inference scheduler.

Arguments

  • inference_scheduler_name: The name of the inference scheduler for the inference events listed.
  • interval_end_time: Returns all the inference events with a start time equal to or less than the end time given.
  • interval_start_time: Lookout for Equipment will return all the inference events with an end time equal to or greater than the start time given.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "MaxResults": Specifies the maximum number of inference events to list.
  • "NextToken": An opaque pagination token indicating where to continue the listing of inference events.
source
Main.Lookoutequipment.list_inference_executionsMethod
list_inference_executions(inference_scheduler_name)
list_inference_executions(inference_scheduler_name, params::Dict{String,<:Any})

Lists all inference executions that have been performed by the specified inference scheduler.

Arguments

  • inference_scheduler_name: The name of the inference scheduler for the inference execution listed.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "DataEndTimeBefore": The time reference in the inferenced dataset before which Amazon Lookout for Equipment stopped the inference execution.
  • "DataStartTimeAfter": The time reference in the inferenced dataset after which Amazon Lookout for Equipment started the inference execution.
  • "MaxResults": Specifies the maximum number of inference executions to list.
  • "NextToken": An opaque pagination token indicating where to continue the listing of inference executions.
  • "Status": The status of the inference execution.
source
Main.Lookoutequipment.list_inference_schedulersMethod
list_inference_schedulers()
list_inference_schedulers(params::Dict{String,<:Any})

Retrieves a list of all inference schedulers currently available for your account.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "InferenceSchedulerNameBeginsWith": The beginning of the name of the inference schedulers to be listed.
  • "MaxResults": Specifies the maximum number of inference schedulers to list.
  • "ModelName": The name of the ML model used by the inference scheduler to be listed.
  • "NextToken": An opaque pagination token indicating where to continue the listing of inference schedulers.
  • "Status": Specifies the current status of the inference schedulers to list.
source
Main.Lookoutequipment.list_label_groupsMethod
list_label_groups()
list_label_groups(params::Dict{String,<:Any})

Returns a list of the label groups.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "LabelGroupNameBeginsWith": The beginning of the name of the label groups to be listed.
  • "MaxResults": Specifies the maximum number of label groups to list.
  • "NextToken": An opaque pagination token indicating where to continue the listing of label groups.
source
Main.Lookoutequipment.list_labelsMethod
list_labels(label_group_name)
list_labels(label_group_name, params::Dict{String,<:Any})

Provides a list of labels.

Arguments

  • label_group_name: Returns the name of the label group.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "Equipment": Lists the labels that pertain to a particular piece of equipment.
  • "FaultCode": Returns labels with a particular fault code.
  • "IntervalEndTime": Returns all labels with a start time earlier than the end time given.
  • "IntervalStartTime": Returns all the labels with a end time equal to or later than the start time given.
  • "MaxResults": Specifies the maximum number of labels to list.
  • "NextToken": An opaque pagination token indicating where to continue the listing of label groups.
source
Main.Lookoutequipment.list_modelsMethod
list_models()
list_models(params::Dict{String,<:Any})

Generates a list of all models in the account, including model name and ARN, dataset, and status.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "DatasetNameBeginsWith": The beginning of the name of the dataset of the ML models to be listed.
  • "MaxResults": Specifies the maximum number of ML models to list.
  • "ModelNameBeginsWith": The beginning of the name of the ML models being listed.
  • "NextToken": An opaque pagination token indicating where to continue the listing of ML models.
  • "Status": The status of the ML model.
source
Main.Lookoutequipment.list_sensor_statisticsMethod
list_sensor_statistics(dataset_name)
list_sensor_statistics(dataset_name, params::Dict{String,<:Any})

Lists statistics about the data collected for each of the sensors that have been successfully ingested in the particular dataset. Can also be used to retrieve Sensor Statistics for a previous ingestion job.

Arguments

  • dataset_name: The name of the dataset associated with the list of Sensor Statistics.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "IngestionJobId": The ingestion job id associated with the list of Sensor Statistics. To get sensor statistics for a particular ingestion job id, both dataset name and ingestion job id must be submitted as inputs.
  • "MaxResults": Specifies the maximum number of sensors for which to retrieve statistics.
  • "NextToken": An opaque pagination token indicating where to continue the listing of sensor statistics.
source
Main.Lookoutequipment.list_tags_for_resourceMethod
list_tags_for_resource(resource_arn)
list_tags_for_resource(resource_arn, params::Dict{String,<:Any})

Lists all the tags for a specified resource, including key and value.

Arguments

  • resource_arn: The Amazon Resource Name (ARN) of the resource (such as the dataset or model) that is the focus of the ListTagsForResource operation.
source
Main.Lookoutequipment.start_data_ingestion_jobMethod
start_data_ingestion_job(client_token, dataset_name, ingestion_input_configuration, role_arn)
start_data_ingestion_job(client_token, dataset_name, ingestion_input_configuration, role_arn, params::Dict{String,<:Any})

Starts a data ingestion job. Amazon Lookout for Equipment returns the job status.

Arguments

  • client_token: A unique identifier for the request. If you do not set the client request token, Amazon Lookout for Equipment generates one.
  • dataset_name: The name of the dataset being used by the data ingestion job.
  • ingestion_input_configuration: Specifies information for the input data for the data ingestion job, including dataset S3 location.
  • role_arn: The Amazon Resource Name (ARN) of a role with permission to access the data source for the data ingestion job.
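
A sketch of kicking off ingestion from S3, assuming the @service-generated module; the dataset name, bucket, prefix, and role ARN are placeholders:

using AWS
@service Lookoutequipment
using UUIDs

resp = Lookoutequipment.start_data_ingestion_job(
    string(uuid4()),       # client_token
    "my-pump-dataset",     # dataset_name (illustrative)
    Dict("S3InputConfiguration" => Dict("Bucket" => "my-input-bucket", "Prefix" => "historical/")),
    "arn:aws:iam::123456789012:role/LookoutEquipmentRole",   # role_arn (placeholder)
)
println(resp["JobId"], " ", resp["Status"])
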
source
Main.Lookoutequipment.start_inference_schedulerMethod
start_inference_scheduler(inference_scheduler_name)
start_inference_scheduler(inference_scheduler_name, params::Dict{String,<:Any})

Starts an inference scheduler.

Arguments

  • inference_scheduler_name: The name of the inference scheduler to be started.
source
Main.Lookoutequipment.stop_inference_schedulerMethod
stop_inference_scheduler(inference_scheduler_name)
stop_inference_scheduler(inference_scheduler_name, params::Dict{String,<:Any})

Stops an inference scheduler.

Arguments

  • inference_scheduler_name: The name of the inference scheduler to be stopped.
source
Main.Lookoutequipment.tag_resourceMethod
tag_resource(resource_arn, tags)
tag_resource(resource_arn, tags, params::Dict{String,<:Any})

Associates a given tag to a resource in your account. A tag is a key-value pair which can be added to an Amazon Lookout for Equipment resource as metadata. Tags can be used for organizing your resources as well as helping you to search and filter by tag. Multiple tags can be added to a resource, either when you create it, or later. Up to 50 tags can be associated with each resource.

Arguments

  • resource_arn: The Amazon Resource Name (ARN) of the specific resource to which the tag should be associated.
  • tags: The tag or tags to be associated with a specific resource. Both the tag key and value are specified.
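
A short sketch, assuming the @service-generated module; the resource ARN is an illustrative placeholder:

using AWS
@service Lookoutequipment

Lookoutequipment.tag_resource(
    "arn:aws:lookoutequipment:us-east-1:123456789012:dataset/my-pump-dataset/abcd1234",  # placeholder ARN
    [Dict("Key" => "team", "Value" => "reliability"), Dict("Key" => "env", "Value" => "prod")],
)
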
source
Main.Lookoutequipment.untag_resourceMethod
untag_resource(resource_arn, tag_keys)
untag_resource(resource_arn, tag_keys, params::Dict{String,<:Any})

Removes a specific tag from a given resource. The tag is specified by its key.

Arguments

  • resource_arn: The Amazon Resource Name (ARN) of the resource to which the tag is currently associated.
  • tag_keys: Specifies the key of the tag to be removed from a specified resource.
source
Main.Lookoutequipment.update_inference_schedulerMethod
update_inference_scheduler(inference_scheduler_name)
update_inference_scheduler(inference_scheduler_name, params::Dict{String,<:Any})

Updates an inference scheduler.

Arguments

  • inference_scheduler_name: The name of the inference scheduler to be updated.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "DataDelayOffsetInMinutes": A period of time (in minutes) by which inference on the data is delayed after the data starts. For instance, if you select an offset delay time of five minutes, inference will not begin on the data until the first data measurement after the five minute mark. For example, if five minutes is selected, the inference scheduler will wake up at the configured frequency with the additional five minute delay time to check the customer S3 bucket. The customer can upload data at the same frequency and they don't need to stop and restart the scheduler when uploading new data.
  • "DataInputConfiguration": Specifies information for the input data for the inference scheduler, including delimiter, format, and dataset location.
  • "DataOutputConfiguration": Specifies information for the output results from the inference scheduler, including the output S3 location.
  • "DataUploadFrequency": How often data is uploaded to the source S3 bucket for the input data. The value chosen is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts once every 5 minutes.
  • "RoleArn": The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference scheduler.
source
Main.Lookoutequipment.update_label_groupMethod
update_label_group(label_group_name)
update_label_group(label_group_name, params::Dict{String,<:Any})

Updates the label group.

Arguments

  • label_group_name: The name of the label group to be updated.

Optional Parameters

Optional parameters can be passed as a params::Dict{String,<:Any}. Valid keys are:

  • "FaultCodes": Updates the code indicating the type of anomaly associated with the label. Data in this field will be retained for service usage. Follow best practices for the security of your data.
source