As per the documentation, the rate limit for Quotes is 10 requests/second, but I am getting this error even though I am making only one request per second:

{'status': 'error', 'errors': [{'errorCode': 'UDAPI10005', 'message': 'Too Many Request Sent', 'propertyPath': None, 'invalidValue': None, 'error_code': 'UDAPI10005', 'property_path': None, 'invalid_value': None}]}

Is there any other limit threshold, like per minute or per hour?

Thank you for getting in touch.

Could you share the specific API request (the curl command) that is causing this problem?

It's already included in the header; I am using the URL below:

https://api-v2.upstox.com/market-quote/quotes?symbol=NSE_INDEX|Nifty%2050%2CNSE_INDEX|Nifty%20Bank%2CNSE_INDEX|India%20VIX%2CNSE_INDEX|Nifty%20Fin%20Service%2CNSE_INDEX|Nifty%20Fin%20Service%2CNSE_INDEX|NIFTY%20MID%20SELECT%2CBSE_INDEX|SENSEX
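For reference, this is roughly how I issue the same request from Python, letting requests handle the URL encoding of the pipes, spaces and commas (just a sketch; the access token is a placeholder):

import requests

# Comma-separated instrument keys; requests will percent-encode them
symbols = ",".join([
    "NSE_INDEX|Nifty 50",
    "NSE_INDEX|Nifty Bank",
    "NSE_INDEX|India VIX",
    "NSE_INDEX|Nifty Fin Service",
    "NSE_INDEX|NIFTY MID SELECT",
    "BSE_INDEX|SENSEX",
])

headers = {
    "accept": "application/json",
    "Api-Version": "2.0",
    "Authorization": "Bearer your_access_token_here",  # placeholder
}

response = requests.get(
    "https://api-v2.upstox.com/market-quote/quotes",
    params={"symbol": symbols},
    headers=headers,
)
print(response.status_code, response.json().get("status"))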

Also, in my Python code I have included sleep(1), so there is a 1-second pause between two calls, but I am still getting the error.

Even when I delay by 2 seconds I get the same error. If there has been any change on the limit side, please let me know the threshold.

This should not happen with that implementation. Below is example code that accomplishes the same task and works correctly.

Overall rate limiting:

  • 25 requests per second
  • 250 requests per minute
  • 1000 requests per 30 minutes

We are in the process of updating the documentation for the above.

import requests
import json
import time

url = 'https://api-v2.upstox.com/market-quote/quotes?symbol=NSE_INDEX|Nifty%2050%2CNSE_INDEX|Nifty%20Bank%2CNSE_INDEX|India%20VIX%2CNSE_INDEX|Nifty%20Fin%20Service%2CNSE_INDEX|Nifty%20Fin%20Service%2CNSE_INDEX|NIFTY%20MID%20SELECT%2CBSE_INDEX|SENSEX'
headers = {
    'accept': 'application/json',
    'Api-Version': '2.0',
    'Authorization': 'Bearer your_access_token_here'
}

iterations = 0
max_iterations = 50

while iterations < max_iterations:
    print("Loop iteration -----", iterations)
    response = requests.get(url, headers=headers)
    data = response.json()
    print("Status Code:", response.status_code)
    print("Status Response:", data["status"])
    # Sleep for 1 second
    time.sleep(1)
    
    iterations += 1
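
If it helps, here is a rough sketch (not part of any official SDK) of a client-side throttle that keeps calls within all three buckets listed above; the sliding-window bookkeeping is my own illustration:

import time
from collections import deque

# Sliding-window limiter for the three buckets above:
# 25 requests/second, 250 requests/minute, 1000 requests/30 minutes.
BUCKETS = [(1, 25), (60, 250), (1800, 1000)]   # (window in seconds, max requests)
history = [deque() for _ in BUCKETS]

def wait_for_slot():
    while True:
        now = time.time()
        wait = 0.0
        for (window, limit), calls in zip(BUCKETS, history):
            # Drop timestamps that have fallen out of this window
            while calls and now - calls[0] >= window:
                calls.popleft()
            if len(calls) >= limit:
                # Wait until the oldest call leaves the window
                wait = max(wait, window - (now - calls[0]))
        if wait <= 0:
            break
        time.sleep(wait)
    stamp = time.time()
    for calls in history:
        calls.append(stamp)

# Usage: call wait_for_slot() immediately before every requests.get(...)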

You are making the calls (10 per second) within the threshold. Please share the Python code you are using so we can analyze further. Information about the application's name, its specifics, and the timing of the calls would also help in identifying the problem.

My Python code is below; my application name is "upstox_new".

import requests
import pandas as pd
import cx_Oracle
from sqlalchemy import create_engine
from datetime import datetime
import schedule
import time

# Read the bearer token from a file
with open(r"C:\Users\raghv\PycharmProjects\Cokkie_fetch\venv\upstox.txt", "r") as file:
    bearer_token = file.read().strip()

# Connect to Oracle database and save data
cx_Oracle.init_oracle_client(lib_dir=r"C:\oracle\instantclient_19_10")
engine = create_engine('oracle+cx_oracle://upstox:upstox@localhost/orcl')

def save_url_data_to_oracle(url, table_name, headers=None):
    try:
        # Fetch data from URL
        response = requests.get(url, headers=headers if headers else {})
        response_data = response.json()

        # Process data to DataFrame
        rows = []
        for index, values in response_data['data'].items():
            try:
                timestamp = datetime.strptime(values["timestamp"], '%Y-%m-%dT%H:%M:%S.%f%z').strftime('%d-%b-%Y %H:%M:%S')
            except ValueError:
                # Handle the case where timestamp doesn't have microseconds
                timestamp = datetime.strptime(values["timestamp"], '%Y-%m-%dT%H:%M:%S%z').strftime('%d-%b-%Y %H:%M:%S')

            base_row = {
                "instrument_token": values.get("instrument_token", ""),
                "symbol": index,
                "last_price": values.get("last_price", 0),
                "volume": values.get("volume", 0),
                "average_price": values.get("average_price", 0),
                "oi": values.get("oi", 0),
                "oi_day_high": values.get("oi_day_high", 0),
                "oi_day_low": values.get("oi_day_low", 0),
                "last_trade_time": datetime.fromtimestamp(int(values.get("last_trade_time", "0")[:-3])).strftime('%d-%b-%Y %H:%M:%S'),
                "total_buy_quantity": values.get("total_buy_quantity", 0),
                "total_sell_quantity": values.get("total_sell_quantity", 0),
                "open": values["ohlc"]["open"],
                "high": values["ohlc"]["high"],
                "low": values["ohlc"]["low"],
                "close": values["ohlc"]["close"],
                "timestamp": timestamp
            }

            # Extract depth buy and sell data
            for i, depth in enumerate(values['depth']['buy'], 1):
                base_row[f'buy_quantity_{i}'] = depth['quantity']
                base_row[f'buy_price_{i}'] = depth['price']
                base_row[f'buy_orders_{i}'] = depth['orders']

            for i, depth in enumerate(values['depth']['sell'], 1):
                base_row[f'sell_quantity_{i}'] = depth['quantity']
                base_row[f'sell_price_{i}'] = depth['price']
                base_row[f'sell_orders_{i}'] = depth['orders']

            rows.append(base_row)

        df = pd.DataFrame(rows)
        df['last_trade_time'] = pd.to_datetime(df['last_trade_time'], format='%d-%b-%Y %H:%M:%S')
        df['timestamp'] = pd.to_datetime(df['timestamp'], format='%d-%b-%Y %H:%M:%S')
        for col in df.columns:
            if df[col].dtype == 'object':
                try:
                    df[col] = df[col].astype(float)
                except ValueError:
                    # If a column can't be converted to float, skip that column.
                    pass

        # Delete existing data and insert new data into the provided table
        engine.execute(f"delete from {table_name}")
        df.to_sql(table_name, engine, if_exists='append', index=False)
    except Exception as e:
        print(f"Error occurred: {e}")
        time.sleep(10)  # Sleep for 10 seconds before retrying
        save_url_data_to_oracle(url, table_name, headers)

headers = {
    'accept': '*/*',
    'Api-Version': '2.0',
    'Authorization': f'Bearer {bearer_token}'
}

url = 'https://api-v2.upstox.com/market-quote/quotes?symbol=NSE_INDEX%7CNifty%2050%2CNSE_INDEX%7CNifty%20Bank%2CNSE_INDEX%7CIndia%20VIX%2CNSE_INDEX%7CNifty%20Fin%20Service%2CNSE_INDEX%7CNifty%20Fin%20Service%2CNSE_INDEX%7CNIFTY%20MID%20SELECT%2CBSE_INDEX%7CSENSEX%2C'
#save_url_data_to_oracle(url, 'live_index', headers)

# Execute the SQL query to fetch data
sql_query = """
select INSTRUMENT_KEY from banknifty_symbol
union
select INSTRUMENT_KEY from nifty_symbol
union
select INSTRUMENT_KEY from finnifty_symbol
union
select INSTRUMENT_KEY from midcapnifty_symbol
"""

result_df = pd.read_sql(sql_query, engine)
instrument_keys = ','.join(result_df['instrument_key'].astype(str))
#print(instrument_keys)
final_url = url + instrument_keys
print(final_url)

def job():
    save_url_data_to_oracle(final_url, 'live_data', headers)

# Schedule the job
def run_schedule():
    while True:
        current_time = datetime.now().time()
        start_time = datetime.strptime("9:15:00", "%H:%M:%S").time()
        end_time = datetime.strptime("20:30:00", "%H:%M:%S").time()
        if start_time <= current_time <= end_time:
            job()
        time.sleep(1)

# Start the scheduler
run_schedule()

I posted my code, but somehow the system flagged it. Today I am again getting the error.

market-quote/quotes?symbol=NSE_INDEX%7CNifty%2050%2CNSE_INDEX%7CNifty%20Bank%2CNSE_INDEX%7CIndia%20VIX%2CNSE_INDEX%7CNifty%20Fin%20Service%2CNSE_INDEX%7CNifty%20Fin%20Service%2CNSE_INDEX%7CNIFTY%20MID%20SELECT%2CBSE_INDEX%7CSENSEXNSE_FO|40392,NSE_FO|40396,NSE_FO|40397,NSE_FO|40398,NSE_FO|40399,NSE_FO|40400,NSE_FO|40401,NSE_FO|40403,NSE_FO|40406,NSE_FO|40407,NSE_FO|40412,NSE_FO|40413,NSE_FO|40416,NSE_FO|40417,NSE_FO|40418,NSE_FO|40423,NSE_FO|40424,NSE_FO|40425,NSE_FO|40426,NSE_FO|40427,NSE_FO|42529,NSE_FO|42549,NSE_FO|42550,NSE_FO|42571,NSE_FO|42572,NSE_FO|42577,NSE_FO|42578,NSE_FO|42584,NSE_FO|42585,NSE_FO|42589,NSE_FO|42590,NSE_FO|42593,NSE_FO|42594,NSE_FO|42597,NSE_FO|42598,NSE_FO|42601,NSE_FO|42602,NSE_FO|42603,NSE_FO|60463,NSE_FO|60464,NSE_FO|60465,NSE_FO|60466,NSE_FO|60467,NSE_FO|60468,NSE_FO|60469,NSE_FO|60470,NSE_FO|60471,NSE_FO|60472,NSE_FO|60473,NSE_FO|60474,NSE_FO|60475,NSE_FO|60476,NSE_FO|60477,NSE_FO|60478,NSE_FO|60479,NSE_FO|60480,NSE_FO|60481,NSE_FO|60482,NSE_FO|60483,NSE_FO|60484,NSE_FO|60485,NSE_FO|60486,NSE_FO|60487,NSE_FO|60488,NSE_FO|60489,NSE_FO|60490,NSE_FO|60491,NSE_FO|60492,NSE_FO|60493,NSE_FO|60494,NSE_FO|60495,NSE_FO|60496,NSE_FO|60497,NSE_FO|60498,NSE_FO|60499,NSE_FO|60500,NSE_FO|60501,NSE_FO|60502,NSE_FO|60503,NSE_FO|60504,NSE_FO|60505,NSE_FO|60506,NSE_FO|60507,NSE_FO|60508,NSE_FO|61521,NSE_FO|61524,NSE_FO|61525,NSE_FO|61526,NSE_FO|61527,NSE_FO|61528,NSE_FO|61531,NSE_FO|61532,NSE_FO|61533,NSE_FO|61534,NSE_FO|61537,NSE_FO|61538,NSE_FO|61539,NSE_FO|61540,NSE_FO|61541,NSE_FO|61542,NSE_FO|61544,NSE_FO|61545,NSE_FO|61546,NSE_FO|61547,NSE_FO|61548,NSE_FO|61549,NSE_FO|61550,NSE_FO|61551,NSE_FO|61552,NSE_FO|61553,NSE_FO|61554,NSE_FO|61555,NSE_FO|61556,NSE_FO|61557,NSE_FO|61558,NSE_FO|61559,NSE_FO|61560,NSE_FO|61561,NSE_FO|61562,NSE_FO|61563,NSE_FO|61564,NSE_FO|61565,NSE_FO|63604,NSE_FO|63605,NSE_FO|63606,NSE_FO|63607,NSE_FO|63608,NSE_FO|63609,NSE_FO|63610,NSE_FO|63611,NSE_FO|63612,NSE_FO|63613,NSE_FO|63614,NSE_FO|63615,NSE_FO|63616,NSE_FO|63617,NSE_FO|63618,NSE_FO|63619,NSE_FO|63620,NSE_FO|63621,NSE_FO|63622,NSE_FO|63623,NSE_FO|65319,NSE_FO|65320,NSE_FO|65321,NSE_FO|65322,NSE_FO|65323,NSE_FO|65324,NSE_FO|65325,NSE_FO|65326,NSE_FO|65327,NSE_FO|65328,NSE_FO|65329,NSE_FO|65332,NSE_FO|65333,NSE_FO|65334,NSE_FO|65335,NSE_FO|65336

I am using this URL since, as we know, we can fetch around 500 symbols in one request, but I am still getting the "too many requests" error.

@Raghav

The 429 error you are receiving could be due to a breach of one of the following buckets:

  • 25 reqs/sec
  • 250 reqs/min
  • 1000 reqs/30mins

In your case, I can see your application breaching the 30-minute count several times a day.
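
As a quick, back-of-the-envelope check on the 30-minute bucket (assuming each request itself takes negligible time): a request every second works out to roughly 1,800 requests per 30-minute window, which is above the 1,000 allowed, so even a 1-second sleep will eventually trip this bucket.

# Requests accumulated in one 30-minute window at 1 request/second
window_seconds = 30 * 60          # 1800 s
delay_between_calls = 1           # time.sleep(1) in the posted code
requests_per_window = window_seconds // delay_between_calls
print(requests_per_window)        # 1800 > 1000 -> 429 / UDAPI10005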

You might want to stay within these thresholds to avoid a 429 in the future.

Thanks!!

Please add this to the documentation, otherwise we will keep thinking it's 10 per second.


A new document has been released, which now has the rate limit details!

You can explore the details here:

I hope this helps.

Hello Pradeep_Jaiswar,

Can you increase the rate limit for me?
My User ID is 349630.

@Aniket_Tawade

Thank you for your inquiry. I’m sorry for any inconvenience, but we maintain consistent rate limits for all users to ensure fairness and optimal service performance. We’re unable to offer custom rate limits on an individual basis.

Thank you for your understanding.

Thanks!!