Apigee Edge is an API management platform that allows developers to secure, manage, scale and analyze their APIs and API traffic. The SwaggerHub integration with Edge lets you export your API definition as an API proxy in Edge. The created proxy will route requests to your existing backend API while adding extra features like security, rate limiting, quotas, analytics, and more.
Supported versions
- Apigee Edge Cloud
- Apigee Edge for Private Cloud
If you use Edge for Private Cloud, you must have already configured an organization and environment.
Supported OpenAPI versions
This integration supports OpenAPI 2.0 and OpenAPI 3.0. SwaggerHub On-Premise users need v. 1.21.0 or later to use OpenAPI 3.0 definitions with Apigee Edge.
Considerations
The integration requires an Apigee user account with a login and password. Accounts using two-factor auth or SAML are not supported.
If you use SwaggerHub SaaS and Edge for Private Cloud, your Edge server must be accessible from the public Internet and allow connections from our IP addresses.
Your API definition must specify the basePath. It will be used as your proxy base path in Edge. For example, if your API has basePath: /api/v2, the proxy’s public URL will be http(s)://{orgname}-{environment}.apigee.net/api/v2. An API with the basePath of / will get a proxy at the root of your Edge environment. If you are going to add multiple APIs to Edge, make sure they have different basePath values.
We recommend that you specify the operationId for your API operations – the operation IDs will be used as endpoint names in Edge. (See the example at the end of this section.)
The proxy in Edge will be given a new revision every time you save the API in SwaggerHub. New revisions are not deployed automatically. To activate the changes, you will have to go back to the Edge console and deploy the new revision manually. The procedure is explained below.
The integration is one-way, meaning your API definitions go from SwaggerHub to Apigee Edge, but not vice versa. You should edit your API definitions in SwaggerHub only.
The integration is tied to a specific version of an API. If your API has multiple versions, you can configure the integration separately for each version, provided that these versions have different basePath values.
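For reference, here is a minimal sketch of an OpenAPI 2.0 definition that follows the basePath and operationId recommendations above. The host, path, and names are placeholder values, not ones required by the integration:

```yaml
swagger: "2.0"
info:
  title: Reports API
  version: "2.0"
host: api.example.com          # placeholder backend host
basePath: /api/v2              # used as the proxy base path in Edge
schemes:
  - https
paths:
  /reports:
    get:
      operationId: getReports  # used as the endpoint (flow) name in Edge
      responses:
        "200":
          description: OK
```

With this basePath, the proxy created in Edge would be reachable at http(s)://{orgname}-{environment}.apigee.net/api/v2.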
How the integration works
The integration is triggered every time you save your API in SwaggerHub. If a proxy does not exist in Edge, it will be created, and then a new revision of the proxy will be created upon each save. The proxy name is saved in the x-apigee-id key at the root level of your API definition.
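For illustration, after the first push the root level of a definition might look like this – a minimal sketch in which the proxy name is a placeholder, not a value the integration requires:

```yaml
swagger: "2.0"
info:
  title: Reports API
  version: "2.0"
x-apigee-id: reports-api   # added by the integration; stores the name of the Apigee Edge proxy
```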
The proxy serves as a management layer in front of your existing backend API and allows you to attach various policies to manage the traffic, add authentication, transform the payload and so on. These policies are configured in the Edge console. Note that new revisions of the proxy will not inherit the policies of previous revisions, so you will have to re-configure the policies manually for new revisions of the proxy.
After configuring the proxy, you can deploy it to the production or test environment in Edge. The integration does not deploy the proxy automatically. This must be done manually and repeated whenever you update your API definition in SwaggerHub.
A deployed proxy has a public-facing URL in the format http(s)://{orgname}-{environment}.apigee.net/{basePath} that API clients can use to access the API. You can then specify the proxy as the host for your API in SwaggerHub in order to test the proxy using the “try it out” button.
The procedure is explained in more detail below.
Step 1. Configure the integration
Open your API page in SwaggerHub.
If the API has several versions, select the version you want to push to Apigee Edge.
If this version is published, unpublish it. Only unpublished APIs can be integrated with Apigee Edge.
Click the API name, switch to the Integrations tab, and click Add New Integrations.
Select Apigee Edge from the list of integrations.
Enter the integration parameters (all are required):
Name – A display name for this integration, for example, Apigee.
Apigee Edge Server – The URL used to access the Edge management API. If you use Apigee Edge Cloud, leave the default value https://api.enterprise.apigee.com/v1; otherwise, replace it with the corresponding URL of your private Edge instance.
Apigee email and Apigee account password – The email and password you use to log in to Apigee Edge.
Click Update to verify the credentials and display other settings.
Organization – The organization within your Edge account where the API proxy will be created. You can see your organization names in your username menu in Apigee Edge.
Click Next to verify the organization permissions and display other settings.
API Name – A unique name for the created API proxy. Valid characters are letters, numbers, dash (-), and underscore (_). You can also specify the name of an existing API proxy – in this case, the integration will create a new revision of that proxy. The proxy name will also be saved in the x-apigee-id key in your API definition.
Target URL – The endpoint to which Apigee Edge will route the requests, for example, http://api.example.com/reports/v2 or https://myapi.com. If your API definition specifies the host, basePath, and schemes, this is scheme://host/basePath (see the example after this procedure).
Enabled – Enables or disables the integration.
Click Create and Execute, then Done.
SwaggerHub will create or update the specified API proxy, and from now on will create a new revision of the proxy every time you save your API definition.
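As an illustration of how the Target URL relates to the definition (the values below are placeholders), a spec containing:

```yaml
schemes:
  - https
host: api.example.com
basePath: /reports/v2
```

would correspond to the Target URL https://api.example.com/reports/v2.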
Step 2. Deploy API proxy to production
The created API proxy is not deployed automatically. You need to deploy the proxy manually, and redeploy it whenever you update the API definition in SwaggerHub.
To deploy the proxy:
In Apigee Edge, click your username and select the organization from the menu.
Click API Proxies. You will see the list of existing API proxies for your organization.
Click the proxy created for your API. This will display the latest revision of the proxy.
Expand the Proxy Endpoints to verify that the operation list matches your API definition. The endpoint names in the Endpoint Flow Name column are taken from the operationId values specified in your API definition.
Click the Deployment button at the center and select the prod environment.
Your API proxy is now deployed to production.
At this point, the proxy is a simple pass-through proxy that forwards all requests to your backend API. If you wish, you can attach policies to the proxy that would implement rate limits and authentication, transform requests or responses, or augment your APIs in other ways. Refer to the Apigee Edge documentation to learn more about using policies.
Note: New revisions of the proxy do not inherit the policies of previous revisions, so you will need to reconfigure the policies when a new revision is created.
Step 3. Update host for interactive API docs
Now that your API proxy is deployed, you can change the host and schemes in your API definition to point to the proxy. This way, the requests made from the interactive API docs in SwaggerHub will be directed to this proxy, and the proxy will forward them to your original backend API.
You can find the base URL of the proxy in Apigee Edge:
Specify the corresponding values in your spec in SwaggerHub:
host: hkosova-trial-prod.apigee.net
basePath: /v2
schemes:
  - https
  - http
Now, you can test the proxy using the “try it out” button in SwaggerHub.
Note that changing the host in the spec will not affect the target host configured for the proxy in Apigee Edge, because the target host is taken from the integration settings and not from the spec.
Disable the integration
If you no longer wish to push your API definition to Apigee Edge, you can either delete the integration or disable it. Your API proxy will remain in Apigee Edge, but SwaggerHub will not update it anymore.
See Also
SwaggerHub Integrations
FAQs
What is the response limit for Apigee?
Feature area | Limit | Currently enforced in Apigee hybrid?
---|---|---
Target connection timeout | 600 seconds | Planned
API proxy request URL size | 10 KB | Yes
Request header size | 60 KB | Yes
Response header size | 60 KB | Yes
- Take the API-first approach to application integration.
- Quickly configure integrations with intuitive drag and drop interface.
- Monitor and track the health of integrations with built-in monitoring.
- Reduce the risk associated with data connectivity challenges.
Apigee enables you to provide secure access to your services with a well-defined API that is consistent across all of your services, regardless of service implementation. A consistent API: Makes it easy for app developers to consume your services.
What is success code in Apigee?
By default, Apigee treats HTTP codes 4XX and 5XX as errors, and it treats HTTP codes 1XX, 2XX, and 3XX as success. The success.codes property enables explicit definition of success codes; for example, setting it to 2XX, 1XX, 505 treats any 100, 200, and 505 HTTP response codes as success.
What is the maximum API response?
General quota limits: 10 queries per second (QPS) per IP address. In the API Console, there is a similar quota referred to as Requests per 100 seconds per user. By default, it is set to 100 requests per 100 seconds per user and can be adjusted to a maximum value of 1,000.
API Gateway has account-level quotas, per Region. The throttle quota is 10,000 requests per second (RPS) with an additional burst capacity provided by the token bucket algorithm. The maximum bucket capacity is 5,000 requests per account and Region.
What is the difference between API Connect and Apigee?
Apigee Edge is an API management platform now owned and offered by Google, since Google acquired Apigee in 2016. IBM API Connect is a scalable API solution that helps organizations implement a robust API strategy by creating, exposing, managing, and monetizing an entire API ecosystem across multiple clouds.
What is the difference between Apigee and API gateway?
Functionality: Both platforms offer a variety of features and capabilities. However, Apigee is a full API management product that provides more out-of-the-box functionality. Amazon API Gateway, on the other hand, is simply an API gateway that relays requests and supports integration with other AWS services.
What are the three core services in Apigee Edge?
Apigee Edge consists of API runtime, monitoring and analytics, and developer services that together provide a comprehensive infrastructure for API creation, security, management, and operations.
What are roles in Apigee Edge?
Custom roles let you apply fine-grained permissions to Apigee Edge entities such as API proxies, products, developer apps, developers, and custom reports. You can create and configure custom roles either through the UI or using APIs. See Creating custom roles in the UI and Creating roles with the API.
What is basic authentication in Apigee Edge?
With Basic Authentication, you pass your credentials (your Apigee account's email address and password) in each request to the Edge API. Basic Authentication is the least secure of the supported authentication mechanisms. Your credentials are not encrypted or hashed; they are Base64-encoded only.
Is there any coding in Apigee?
Apigee Edge supports multiple languages, such as JavaScript, Java, Python, and Node.js, for API management. It allows configurable policies such as API key verification, access control, OAuth, and JWT. It is available both in the cloud and on-premises.
What are flow hooks in Apigee?
With a flow hook, you attach a shared flow so that it executes at the same place for all API proxies deployed to a specific environment. This gives you a separately implemented and deployed sequence of logic that is not part of a proxy's implementation code.
What is 502 in Apigee?
An Apigee server acts as a gateway between the client and the backend. If a request takes more than 2 minutes to complete (which can happen if the backend calls another backend service that does some long calculations), it yields a 502 Bad Gateway error on the front end.
How many API requests are too many?
In the API Console, there is a similar quota referred to as Requests per 100 seconds per user. By default, it is set to 100 requests per 100 seconds per user and can be adjusted to a maximum value of 1,000. But the number of requests to the API is restricted to a maximum of 10 requests per second per user.
How long should an API call take?
Generally, APIs that are considered high-performing have an average response time between 0.1 and one second. At this speed, end users will likely not experience any interruption. At around one to two seconds, users begin to notice some delay.
What is the maximum API payload size?
Resource or operation | Default quota | Can be increased
---|---|---
Total combined size of request line and header values | 10240 bytes | No
Payload size | 10 MB | No
Custom domains per account per Region | 120 | Yes
Access log template size | 3 KB | No
API Gateway has a response payload limit of 10 MB. That's well above Lambda's limit. Therefore, it's fair to assume that any payload size problems are caused by Lambda and your code running there.
What is the maximum response time for API gateway?
The integration timeout is 29 seconds (a hard limit) for all API Gateway integrations. You might encounter two scenarios when you build an API Gateway API with Lambda integration. The scenarios are when the timeout is less than 29 seconds or greater than 29 seconds.
Is there a limit to REST API response length?
"By default, the system limits the size of REST responses that are not saved as attachments to 5 MB. Direct REST responses that exceed this limit generate an error. To support larger response sizes, either save the response as an attachment or increase the response size limit with the glide.pf."
What is API gateway rate limit?
The default rate limit is 10,000 requests per second, and the default burst limit is 5,000 requests. It is possible to increase this limit, provided that it does not exceed AWS's theoretical regional limits. Account-level throttling is enabled by default.