
AWS CLOUD

by Editorial Staff

SWF

  1. Amazon SWF enables applications for a range of use cases, including media processing, web application back ends, business process workflows, and analytic pipelines, to be designed as a coordination of tasks.
  2. Tasks are processed by workers, which are programs that interact with Amazon SWF to get tasks, process them, and return the results. A worker implements an application processing step. You can build workers in different programming languages and even reuse existing components to quickly create the worker.
  3. SWF ensures that a task is assigned only once and is never duplicated.
  4. The maximum duration for a workflow within SWF is 1 year.
  5. AWS Flow Framework is a programming framework that enables you to develop Amazon SWF-based applications quickly and easily. It abstracts the details of task-level coordination and asynchronous interaction with simple programming constructs.
  6. Amazon SWF provides long-polling. Long-polling significantly reduces the number of polls that return without any tasks. When workers and deciders poll Amazon SWF for tasks, the connection is retained for a minute if no task is available. If a task does become available during that period, it is returned in response to the long-poll request.
  7. With Amazon SWF you can use any programming language to write a worker or a decider, as long as you can communicate with Amazon SWF using its web service APIs (see the worker sketch after this list).
  8. SWF limits: 100 SWF domains per account; 10,000 workflow and activity types per domain.
  9. At any given time you can have 100,000 open executions in a domain.
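As a rough illustration of points 2, 6, and 7, here is a minimal activity worker loop in Python using boto3. The domain, task list, and identity names are hypothetical placeholders, and a real worker would add error handling and task heartbeats; this is only a sketch of the long-poll-process-respond cycle, not a complete implementation.

```python
# Minimal Amazon SWF activity worker sketch (Python, boto3).
# Domain, task list, and identity values below are hypothetical.
import boto3

swf = boto3.client("swf", region_name="us-east-1")

DOMAIN = "example-domain"                   # hypothetical domain
TASK_LIST = {"name": "example-task-list"}   # hypothetical task list

while True:
    # Long poll: the connection is held open for up to a minute
    # if no activity task is currently available.
    task = swf.poll_for_activity_task(
        domain=DOMAIN,
        taskList=TASK_LIST,
        identity="worker-1",
    )

    # An empty taskToken means the long poll timed out with no task.
    token = task.get("taskToken", "")
    if not token:
        continue

    # Application-specific processing step goes here.
    result = "processed: " + task.get("input", "")

    # Return the result to SWF so the decider can schedule the next step.
    swf.respond_activity_task_completed(taskToken=token, result=result)
```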

Kinesis

Amazon Kinesis Data Firehose enables near real-time analytics with the existing business intelligence tools and dashboards you're already using today.

  1. Amazon Kinesis Data Firehose synchronously replicates data across three facilities in an AWS Region, providing high availability and durability for the data as it is transported to the destinations.
  2. A source is where your streaming data is continuously generated and captured. For example, a source can be a logging server running on Amazon EC2 instances, an application running on mobile devices, a sensor on an IoT device, or a Kinesis stream.
  3. A shard is a uniquely identified group of data records in a stream. A stream is composed of one or more shards, each of which provides a fixed unit of capacity. Each shard can support up to 5 transactions per second for reads, up to a maximum total data read rate of 2 MB per second, and up to 1,000 records per second for writes, up to a maximum total data write rate of 1 MB per second (including partition keys). The data capacity of your stream is a function of the number of shards that you specify for the stream. The total capacity of the stream is the sum of the capacities of its shards (see the producer sketch below).
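To make point 3 concrete, here is a minimal producer sketch in Python using boto3; the stream name and record shape are hypothetical. The partition key supplied with each record determines which shard the record lands on, so each shard's write limits (1,000 records per second and 1 MB per second, partition key included) are shared by all records routed to it.

```python
# Minimal sketch of writing one record to a Kinesis data stream (Python, boto3).
# Stream name and record contents are hypothetical placeholders.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

STREAM_NAME = "example-stream"  # hypothetical stream

record = {"sensor_id": "device-42", "temperature": 21.5}

# The partition key determines which shard receives the record,
# so keys should be well distributed to use the stream's full capacity.
response = kinesis.put_record(
    StreamName=STREAM_NAME,
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey=record["sensor_id"],
)

print("Shard:", response["ShardId"], "Sequence:", response["SequenceNumber"])
```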
