Google BigQuery Connector Help
File Version: v8
Supported Engine Version: 9

Google BigQuery Connector - Source Code


<?xml version="1.0" encoding="utf-8"?>
<ApiConfig Name="Google BigQuery" Desc="Read / write Google BigQuery data inside your app without coding, using an easy-to-use, high-performance API Connector"
Slug="google-bigquery-connector" Id="581fed19-75c7-4b95-9374-5035954e730f"
Logo="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAJYAAACWCAYAAAA8AXHiAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAABeRSURBVHhe7Z1pcFNXlsfpSU1PaqqmM12Vmur5MOmqqekPXTVTna6ZhBgv2mwCxiFsDgWBsKQDgSyEsHgBW7YWbxgMGIwdzOo4BLMYAgQSEiBpQiAQSCCBYLwveF8ka5fsM/c8XceLruRNy3vy+1f9CmO996R3z1/nnHvfkzxJlChRokSJEiVKlChRokSJEiVKlChR3tMHX9qOvP8hYp+wqEttR45dd6ylQyJqLProqmVmyknH+fm7HBCm7gGJpgekWhEcC2laDyhPOGBZvnURHS5Rw2nJgaoni646IFztAEV6j4gH0GjTsnpg2znb/9HhE+VOkRm2Khl5R7IGUoRNJGFyimM+HUJRAxVbAk+EptrL5WkOEBk9Eo0D0k7bywHgCTqkomIL4KlwFXvAREYOmisizfYcHVZRiUdt7ayBEhk92JsmHrNPp0M7cbXqoPW1CDLjYw2SyNiI2Wr7nA7vxFW4ynaRNTgiYydCTWaK5+2FdIgnno5+a2+MIH0Ba3BExsereQ6gwzyxpDpu+ZNUY7ewBkVk/Ei1Dli133aEDvfE0dzt1iwZY0BEvAdp5HWzt3T/Gx3y4NfZ27bkUG55wS7iQ2RaO2hLrROnJK4+YG9hDYSI98H1wdQT5tV06INXiUdt70o07EEQ8Q3RW+xX6fAHp1YUwD+Gpjrus05exHeEq+1w+O/2GzQMwafTt+wmKan7rJMX8S2zcmw9BZf1T9NQBI92f26ThKlszJMW8T34hl5fbDtNwxE8Iu+YvXiCijSbSIAITbXBvF3m/6IhEb5e3dH2uzCVaCo+oDppD57lh9f2WNtZJynifyLUVth5wZpGQyNcbT1nS5Bp2ScpEhimptt+ouERppYX9v5LSIrVzDo5kcAh1dhgZrbhrzRMwtOUVLMMTyRShHfItbbe4zdNoTRUwtHRa9ZX5For86REAg++4d8rcpyj4RKOYraYTzlPAs0lwkfCUy3w1n7r/9CQ8V+xSvithMw+WCcjwi9Cks05NGz81lf3e/99To7FoGCchAj/kJF2pfiqZR8NH3+VVmpVK7jeSkQoSDWWprgSeIqGkH9SlsK/hpG6zXrxIvwFJ1nTt5jlNIz8k0xrfkcsgcIEV+Q/v+dYTEPJH52/Y1sZlS7OAoUKti8rCi2XaTj5IQD4TXSG6aZCi2VQRKhEqCzw/kdW/qzI4xd6yDTsFysiLF5QWi7QsAZeM7IwnbJfqIiwwKpz9ntbKQ1t4JTwsSVXLIHBhURjg+0X4RkaYv9rx6e9v5OoTcwXJyJsFGrTEhpm/2t6hqVAzFbBCw2z/zUtM/h7K3zjDIW1XbCB57k4z/w9DbV/9Mkt+OeYLFOLQmsmLyI4kJNzkWvMoCA/R2eaYd52CyzNt8C7hyywvtjKsYb8vLzAAq/ssEB0lhmi0p374L6sYwodGTm39FN+vCEwSmN4FgPAejFCQ6q2QGiKBd7cZ4HTt+zQ0NELRiuA1Q7g6AHo6QXopeDP+Dt8zES2adX3wvkf7LDuQ7zeZoZwFfs5hExEqqmeht33mrUV0yX7hQgBfFNI1GZ4+6AFLvxoh8bOXughhhmr0HTt3b1w5b4DEj62cu90Obeux35+oXHoa+snNPS+05qD5iPOPoP9IvgMlm4sX2sOW+FWZQ9nCF/oYWMPKI9Z4cUMS3CUSa0Jjlyx/JlawPs6cBmeFHJf9TLJtHeqx5GaRqmq1l5YsAt7FeFnL0mqOYXawPuavc38jRCNJSWBLbhkB73JRynKgyykHzt508F8XUJiKsn01Abe1cI8+P20DPaT8hV8E8zOMcNXDxw0zIHTD9UOWJQn3NKIY7kg11R1mVQtaonx6+KtjqdmbcUVdvaT8hEcCCx91aQcjVQ469OZeqCqxQ6f3THCjrNdEFfUBm990AqrC1ph4+E22PZJF5y5aYSKJjt0Gnq4WeJI1W7A0ihcc8lJb516zDaZ2mL8mrvdtEihQWMJAwVpNueQTNVGZmoj1fUyC7y3rwUUyscg2dwAEZvqIXwT/suAPi5Pfgxv7GmBiz8YRzwRMNsA3jpgIX0X+7XzHZz8UFuMX/O2o2PZT8RHFu22wOPO4SNtsfVC6Q0DLN7RwhkmLLGBM9VoQPOFJtbDvKwmOHxFT/q44VNYp7EX3j6IN9exXz+fkWuMQKrX+P+c3bIPLEfkAhqAaRkmuN/gObiYXcoe2+CNvBZ4Ia6eaZixEBLfAPOzm+D7CuuwJRKNPzNbWG/YPqIzTXDmtvUv1CJj07QMYy/r4HwkQmWGG+XDN+onvzXAlATvGWook4lZsT8bTvUdPSQDGJjnwnciUg1j/w75pfkmHddbYcbiOTK1CVQnrWD34CvMVGnHO7nSxTKEN8Gy+t7+NtJTuS/J+Hr2f9kNUlU385z4zIukMlCbjE5v7jeF4wVW1kH5BvYq83daPJYfDGLK0Q4ITWAbwRegubDcGizuzYV6s6CNvDGEZS4c89idVh21y8g1fweayigIwlNNcPa2nYbJVXgdMO1EJwl0HQk4Ziv/EU5mj6s/aAOb3b25bpZb4IX4eojEssg4P76C1ez1fOMcapnhNT29ew2WFtbB+MiKQs/Z6tz3JphCAscKvD8II6V3zwX3PRdm06Qj7dy2il8zF/tc+cbM7FGUxNjtwjkxzFaXfnLfWJU32iBSyQ64P8F1r2sPLfRVueqXehuZVdaBNOkxMZeeea58BNfjFucZM6h13GvtIfM9hYZ9ED4yPcMI3Wb3Zeb9A21cUFnB9jcLc5q47MSSzdEL87c2ca9VmtRAyoxwymJMlgmuPrQ+Sy3kqvm7Tf9B6ryDtTMfkZE3wMGv3fdWd6utXI/DCnIgwB7vzE0DfXWuuvKTCcK5GasToZgLE9GsrYZ71EauWrUf73Vm78xHotKM0KJnN1eYAZbsdGaAgcEdKeGb6mBVQQvX+wxkAck6YzWr8zJQPXc9kiXsE2O0eEnJuT2XubiyyP/WZCqJxcJdpj9SK/Ur4ahtlUQlrFnJK2TmircJs1TXZh9XtpqSUAdX75vo0fqV+2kXhJLHWPuMhBAyibhRZqZHc9U7ha2D3gzOngsbevYY8Iml+Yw/Ybd4t7BMJVcbIOmYlYbDVfu/0I2rt0Jjff2z942FKD9up0dz1f4vddxzD9qHZC4hLEXISExeL7CspZaaNGldsTUTfxmpFQ6YXU9+x05XWFKcjTAGaGx4NlYtc5+RMk3VAEY3i6bfPTLDC3Gux5cm4VJE32yRPSZ8YNHuAcsPMzIND1gb8RnMWGWN7F4FyyBO3YcGZzT40liTiXFuV7CXHtpIz4h9GGu//p6LPSZ8AC9PqU5Y5nHGis5kb8RnpmeYwGhlv+uvPzTD8xvHF3xfGgsz0vFr3fSIg4U3G87OwAaevS+XuTTYc7HHhQ/Itaazk87/YK9
2TmuFxdwc16D36dwtA0yJ56+xwhNrYefZTnpEVy3f1czcrw8pgc/mWphrME1aWYhTWfYGfObVXe6NVXRFT4LHDgqyck8zvJk/mBkavHDcv40vjYW9n6rEfQO/Zl8Lc7+B9JdF/vVcL2cbYVIkz9OqO5blu5+y771IjMXNCF0DEqmsYy5RbC5up7NI53a+NBaSUNRKj+iqDYdamfsMha8919Q0AzWWAFm6x33GKrzYxZUb6WZX0Fh6xiUgDHTEpv7t0DwsY+36tBPCyGMDjzkWPBlr46EW5j7uiNQQczHGKFBEabth0qyt7Af5ziIPpfDjv2MpdBMEHxhLQnghroZryvupAcmA4w0En0d7vIMe0VXv7G1m7ucOWRLpudRdzozBGCt/M4NMBiftu2xxNoICY/Y295+MwY9uhZDAsoLgyVgDjTAaY72YWgcX7hjgix+Ng5if3TBouz7Q9AWfub+NBi9FsfbzhIxb59Ixx8rfLNhJjIWKYjzId6alG6DLwF7Hul1pgckbRm+ssWasGWrs21yPuSy3cdB2fWBGO3uTvdxgcwDEaOqZ+w0LyVx8KIsRKfqznLEW5JKZIWMDPoMLcXdr2cZq1fW47YP4YCzMpr/Usy9HNXT0DHodoyapNqCZS67uhs1HTU5jLcztPstd6GRsyFckqXoovmqj4RgsLJEr9zQxexxflMLRGmtu5mPu7guWLt8zuS3jI8VZFknPxRg3XzM9g5bBPk1LZ2/IV/Cdsa7IfZ91/rbBrbG6GdfpfJGxlu50NRa+ptxz7vurnec6yXOPz1gINvSRGv9nLoXWuIpayimZqnttlFZPporCYfZWA1jYSQva9A6QbsIADUZCfnenyvU6XUJR3xTfuV1ofI1bY4UmVP+6HRKtcjVWRZONzBb7t+nj+Q3V8MBNGUQt57Kc635jIqkGIklZZI2dLwhN7oJY5U+/pZZyqviK5U8vphmYO/CXbqhsZt/vjtfc4g61kBnY4MFGY83U1kOXcXB/hsZyZiznduMxFn4i59Vtj7nnGrhdBPl/bFY9M7uhTNZemJqCl2wG7zceZFzP1ckYO++Ck4Z1h41vUzsNVrS644+snfiKgpyMptT9BxQaO3uYgcJMgkYyWPrNhf93lk7nNmM1Fn4fhLoE77Pvf3wgP9e6z1YfcpeiBh/bG3BlkfRcrDH0BpGEl7IMncWX9U9TK7kqPKXrHmtnvkKmttCiY2cA1NbT7S5ZCwlLqILUo21gx9RG5I2MhfeB7TjTwR174ON9vLO3kR7FVXh/VlQK9mCu+3kFkrl8VRZlKt3wHwHTlBqelfuxLo8XmVoPx751nwXw/vJ5WbhQ6TrYmFVWFTRxpWtTcV/z7nxsNMbqa97XH0Rz9v9+ILiI+rjD/cfUvntkIcdl7+s1uLLo/cwlU3cVUft41sq9hmtYM1kH4SOv7DAQA7nPWvhpHXdZBLNZLDHewq0Ng7LFaIylUNbCArI/KzMiIeRYuDLvTmjK5bmPmft6G2dZ9F7PJU3VQe5584vUOsML+xfWgfiITKWHvItupodUF+4YQcYYaHeMxlieQLN6unyDOnWjG0LivN9buQMbem+URUw+uz+3PKKWGZkitZ1zcB0kSiDEZOo9fnAVdfSqHkI2VpLBxSB6JmJTNSR91AaX7ppIxrPAvRoLfPmjEd7+AO+lZ+8zlND4Kth+poubobqTjfRls9NxgsE+hi/5taFnjOdICEnq6p62o+yfqGVGpl2X4Q9kmmpmHZCP4GWMpXuMbj+k0Ce8uxSNIRmBOSISq0gJxX8H/8zadihYeg9d6nK7gIuyO3oh/jAuiWCZZh/Hl8g2Y+bqYI7ncMhJw77usG4ZtcvodP5H+xEhZS0018ErVo8ZAoVT/vnZ9SM2yWjAY8ZoauHSPdcyOlQP6m2cWVnH8RfOskh6LsZ4emLBTsP4/hTKlKTOI6wD85UwZRdcvOvh29cGKOeTDpi8gZRGzF5eAI+VQDLQSIVfr5RY1EJKJvt4/iRKQ8oiYzxZRKR0wRc/2z+kFhmb1h42/EWSwn4CvhKdoYdrZcObCxNbVbMdtpR2gCKZzN6w/2IMuifQFBFkNpj0UQs8qBv+u0eHCr+BMPNEG0yJ857BxwRp6keauaZqu7KoPcanNYeMn7GegK9g+Z6eroPvKkaWuVAtOgccv6aH9QeaYE5GHchJmcA+CRvwgYQRI+Hscqa2Dt4tbILDl3VQ0+r+i0lGIlxHyyDmwuMzg+4nnD2XZ3OFKXWQesL4HLXG+PT1L/ZZmP5YT8RnsCxe+sk+bM/FUouuB779xcyZDT+uv49Q8o0evnlggsftIzfsSIWZC8tiOK63MYLuN8ibxl1ZxB523yVLFbWFdzQ3RzeZW7XlnlQ44GvOPmPkyl4ghUsdlc2e19qwjGpKWmlZRIMFDudscfBYhio7bsfGwhPUEt6R8gA8Ganuqhr6ZEJAruqEFXktZDboObC+EH7cf8PBJpKJKmFeJl7W8Vw2sSxmlfb1XOyg+wMs95GqfnPJVF2wel/3AmoH7+rYDXsuPsHAoAmFqVodSfO1kH+hE1pJP+XrDIa35ZR8000CVAUSXKciwcK1sMjkajJh8GxwvLs05UgrRHBlMXBw5uJ6ri54bbe+gtrANwpJak8ZGjQhgV+TLU0sh8KLOjC7+e6H8Qg/DHHqejcokqq5LMUK2FRlNTS0e85cuBSBM00+lMXQpA44dt2WSC3gG63aq/9zeIrTxYKE9FyypAYy46uEaBXel9UMX94dfsXek7B8XSONvupoK7ycVksMVQESRoD6wAwWm1UL9cOYC1fms04Gtizia1UkV26g4fet1hd1fxLJCpqAwA8c4MBhecL1q2hVLemFmqH4qy4y8zNztw9jVmnvdnC33uAf0cSfsUfCv8OD38qHpW5zcQvM1NZCaFy589LMkMC4A5/3JU0NVLcMUxaJadOOt3Hbs47ja8LiK41xhxqeoaH3rU58Z/3vqWmdPayACQcdl7mGDiQGcMrGR/Dc+grOcLi2hNkHwdIWsoE8tq6c+1eSOL5MwmUDUjKxwfckOymLK/Iaf+3V/AUufWSfar5Lw+4fzc02/FVGZltRGgGDZTEZMxcaJDCgOeek1w5rrsyT7RAeX8E8hq9Ysr1++DtDfaHo9K7vmQETGIE2VwQx19wM9z0XLsrKkzCLsPf3BaQEwgxV5Uwaav9q1+e2xIgUXOtgB0w4YOaqYw6wv8DMpSDmGdpzoalC/ZypkCU51eU0zIFR9lkzI1DCJOBlkTBDXfNr5rr+0ASyzextfUkImYXmnGkNTLbq0+IC0zPhymDIWgQ1miuwmQvLIs4W39/fCFHJOGlgb+cr8Pm3nW4NTG81VOuKuku5D0WygiVAAm2uQCLbXNWkVMI/0NAGViUAT8zf0dXDCpIwCfxsMRDgwm5K8eMzNKz80OEr+gvSVCyJwYPTXNg4TwxW5jXwowQOVXRGx2VWgISMLLmWGYRgY0pcBczQ1oTTUPJLmaWmxdJULCXsIAmTvp6LHZBgQJJYAQu2VHv3Bj5vK+dTMyi4m8SCi2
Aui5Fk9qksbvpPGkJ+Kjaj46kIZXsTKziChrxZgjFzYcMef7DhPg0fv7Vqb/c+BXf3YbCBZTG4ei755opqGjZh6G8F3YzABAPB03OFxVfAun0NW2jIhKHt581bJNx1xCCEK4vYc5XTIOG/QqMCXs+t5efywnB6KaPtXJSmnQQjOFEk1TACJgzCEx7BO/l1MhoqYWn9h/pQaUpHLysowUEHyLmeix08voLLC0tyaitpmISpzFMmY6S6jRGU4EEhIHPJEh9BtKpSmCVwqF7JwUVTdlCCBaFkLiyBq/bUltDQCFtLdnWmKVTsgAQPHYLouWanVQVHturT63mdMFUT3CUR4XNZjIh/BCv31LK/i12oSj5qWCJRtnHmCmaiSD+JX4mEvQyfkBKWba8JrmzVp5j09tM48KyABBNRBPxYFyvAgSIqqRxWZDe4/4J/IWtFgf7p6PS2nqloriCHK4s8MZeENOwxmqqvaBiCU+8d6LgRmcoORrDBlUUemCs6tSI4S+BQLdxBGvkhQQhWAm2ucNKwL95anUyHPrgVk9mxOkrVygxEMBLIshijmiDZqk+LcruYQQhWApG5pKS3WlNYH9jPB/pb7x40/u80DWatiUMUwWmuMp8jTSiDBdmVjXS4J5b+lq9rjORK4gSCW4qoYprBm0Qll0+sEjhQJSXwRHQ6Y/CDHT+YS574aD8d5omp5fm6wshUxuBPAHxVFmNUlXD2huFZOsQTV87lB/bgBzvezlyS+DKYpS2fQ4d2Yuu1PN10eUoLc+CDHi+WRWzYY9RVDXRYRaEUqa3PyVIYAz9B8EZZfP79X65PAvgNHVJRfZIpWxe+yC1BYPaaeDjN9XDURMQ/hNCNZefoMIpiSX28a/6UTc23I1XswQ9qyJvKWRbZBhqKNOEhKDaXwaJtNbfp8InyJPx+puPXdOXcgmJKMzsIQcxIzDV5/UN4K78OZqTX/L6kpMS7f99mIiiuqGPRlM2ti7Qnu+GtvR2wsmACkN8Ob+xuhKXbqwextrAeUj9uuilPKFukLGn+Ax0iUaJEiRIlSpQoUaJEiRIlSpQoUaJEjUmTJv0/jN5iAMBRgJYAAAAASUVORK5CYII="
           TrustedDomains="cloudresourcemanager.googleapis.com"
  Version="8"
  EngineVersion="9"
  >

  <VersionHistory>
	<Change Date="2024-01-02" Type="New">Added Project / Dataset Dropdown for easy selection on connection screen</Change>
    <Change Date="2023-04-11" Type="Fix">Location option doesn't work because the child endpoint in the get_query endpoint didn't pass the same location in the URL</Change>
    <Change Date="2023-03-17" Type="New">Added Job Location option for Query / Connection Level</Change>
    <Change Date="2023-01-05" Type="Fix">Duplicate rows returned in some cases</Change>
    <Change Date="2023-01-05" Type="New">Add support to call stored procedure without causing duplicate API calls (CRUD Operation)</Change>
    <Change Date="2023-01-05" Type="New">Add request caching logic so only one job request is fired</Change>
    <Change Date="2022-10-13" Type="Modified">Added retry logic on error code 429 (Too many requests).</Change>
    <Change Date="2021-10-26" Type="New">Added MaxRequestSize to auto adjust Request to avoid going over 10MB data limit for insert operation</Change>
    <Change Date="2020-05-15" Type="New">Added support for Service Account Credentials (Using P12 key file)</Change>
    <Change Date="2020-05-15" Type="Fix">UseCustomApp option doesnt work</Change>
    <Change Date="2020-01-27" Type="Fix">DateTime type is not detected correctly</Change>
    <Change Date="2020-01-12" Type="Modified">Added 30 Seconds Cache for get_query endpoint so same query is not executed twice (one for Metadata, one for Data)</Change>
    <Change Date="2020-01-12">Initial version</Change>
  </VersionHistory>

  <ServiceUrls>
    <ServiceUrl Name="https://www.googleapis.com/bigquery/v2" Url="https://www.googleapis.com/bigquery/v2" />
  </ServiceUrls>
  <Auths>
    <Auth Name="OAuth" Label="User Account" HelpLink="https://zappysys.com/links/?id=10002"
          Type="OAuth" ConnStr="Provider=GoogleBigQuery;"
          TestEndPoint="list_projects">
      <Params>
        <Param Name="UseCustomApp" Value="True" Required="True" Hidden="False" Desc="Use your own app credentials or inbuilt app provided by ZappySys for ease of use. If you choose UseCustomApp=true then make sure to obtain your own ClientId and Secret using steps provided (Click [Steps to Configure] link found next to Authentication Type dropdown)" />
        <Param Name="ClientId" VisibleIf="UseCustomApp" RequiredIf="UseCustomApp" />
        <Param Name="ClientSecret" Secret="True" VisibleIf="UseCustomApp" RequiredIf="UseCustomApp" />
        <Param Name="Scope" Value="https://www.googleapis.com/auth/bigquery https://www.googleapis.com/auth/bigquery.insertdata https://www.googleapis.com/auth/cloud-platform https://www.googleapis.com/auth/cloud-platform.read-only https://www.googleapis.com/auth/devstorage.full_control https://www.googleapis.com/auth/devstorage.read_only https://www.googleapis.com/auth/devstorage.read_write" />

		<Param Name="ProjectId" Label="&#009;ProjectId (Choose after [Generate Token] clicked)" Value=""
			   OptionsEndPoint="list_projects" 
			   OptionsEndPointValueColumn="projectId" 
               Required="True" Desc="Login to https://console.cloud.google.com/bigquery and choose Project dropdown at the top to see list of Projects. Over there you will find ProjectID next to ProjectName. You need to get ProjectID which has BigQuery API support enabled."
               />

        <Param Name="DatasetId" Label="DatasetId (Choose after [Generate Token] clicked and ProjectId selected)" Required="True" 
			OptionsEndPoint="list_datasets" 
			OptionsEndPointValueColumn="datasetId" 
			OptionsEndPointParameters="ProjectId=[$ProjectId$]" Value="" Desc="Default Dataset Name you like to use when listing tables (e.g. MyDataset)." />

        <Param Name="RetryMode" Value="RetryWhenStatusCodeMatch" Options="None;RetryAny;RetryWhenStatusCodeMatch;" Hidden="True" />
        <Param Name="RetryStatusCodeList" Value="429|503" Hidden="True" />
        <Param Name="RetryCountMax" Value="5" Hidden="True" />
        <Param Name="RetryMultiplyWaitTime" Value="True" Hidden="True" />   
		<Param Name="Location" Template="LocationParam" />
      </Params>
      <Notes>
        <![CDATA[This connection can be configured using two ways. Use Default App (Created by ZappySys) OR Use Custom App created by you. <br />
To use minimum settings you can start with the ZappySys created App. Just set UseCustomApp=false on the properties grid so you don't need a ClientID / Secret. When you click Generate Token you might see a warning that the App is not trusted (simply click the Advanced link to expand the hidden section and then click the Go to App link to proceed).
<p />
To register custom App, perform the following steps (Detailed steps found in the help link at the end)
<ol>
  <li>Go to <a target="_blank" href="https://console.developers.google.com/">Google API Console</a></li>
  <li>From the Project Dropdown (usually found at the top bar) click Select Project</li>
  <li>On the Project Popup click <b>CREATE PROJECT</b></li>
  <li>Once the project is created you can click Select Project to switch the context (You can click on the Notification link or choose from the Top Dropdown)</li>
  <li>Click <b>ENABLE APIS AND SERVICES</b></li>
  <li>Now we need to Enable two APIs one by one (BigQuery API and Cloud Resource Manager API).</li>
  <li>Search <b>BigQuery API</b>. Select and click <b>ENABLE</b></li>
  <li>Search <b>Cloud Resource Manager API</b>. Select and click <b>ENABLE</b></li>
   <li>Go back to the main screen of <a target="_blank" href="https://console.developers.google.com/">Google API Console</a></li>
  <li><p>Click <b>OAuth consent screen</b> Tab. Enter necessary details and Save.</p>
      <ol>
          <li>Choose <em>Testing</em> as <strong>Publishing status</strong></li>
          <li>Set application <strong>User type</strong> to <em>Internal</em>, if possible</li>
          <li>If <em>MAKE INTERNAL</em> option is disabled, then add a user in <strong>Test users</strong> section, which you will use in authentication process when generating Access and Refresh tokens</li>
      </ol>
  </li>
  <li>Click <b>Credentials Tab</b></li>
  <li>Click <b>CREATE CREDENTIALS</b> (somewhere in the top bar) and select <b>OAuth Client ID</b> option.</li>
  <li>When prompted Select Application Type as <b>Desktop App</b> and click Create to receive your ClientID and Secret. You can use this information now to configure Connection with UseCustomApp=true.</li>
</ol>
]]>
      </Notes>
    </Auth>
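    <!-- Illustrative only: a typical connection string produced by the User Account (OAuth) auth above.
         The ProjectId / DatasetId values are placeholders you would replace with your own; the property
         names come from the Params defined in this Auth element.

           Provider=GoogleBigQuery;UseCustomApp=False;ProjectId=my-gcp-project-id;DatasetId=MyDataset;

         With UseCustomApp=True you would also supply the ClientId and ClientSecret obtained from your own
         Google API Console app (see the steps in the Notes above). -->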

    <Auth Name="ServiceAccount" Label="Service Account (Using Private Key File)" HelpLink="https://cloud.google.com/docs/authentication/production#create_service_account"
              Type="OAuth" ConnStr="Provider=GoogleBigQuery;UseCustomApp=True;UseClientCertificate=True;CertificateStorageType=PfxFile;GrantType=ClientCredentials;CertificatePath=[$PrivateKeyPath$];"
              TestEndPoint="list_projects">
      <Params>
        <Param Name="ClientId" Label="Service Account Email" Desc="This is service account email ID (e.g. some_name@my_project.iam.gserviceaccount.com)" Required="True" />
        <Param Name="PrivateKeyPath" Label="Service Account Private Key Path (i.e. *.p12)" Desc="File path for p12 file (i.e. Private Key file for service account). Keep this key file secure" Required="True" />
        <Param Name="Scope" Desc="Permission(s) you like to request" Value="https://www.googleapis.com/auth/bigquery https://www.googleapis.com/auth/bigquery.insertdata https://www.googleapis.com/auth/cloud-platform https://www.googleapis.com/auth/cloud-platform.read-only https://www.googleapis.com/auth/devstorage.full_control https://www.googleapis.com/auth/devstorage.read_only https://www.googleapis.com/auth/devstorage.read_write" />

        <Param Name="ProjectId" Label="&#009;ProjectId" Value=""
			   OptionsEndPoint="list_projects" 
			   OptionsEndPointValueColumn="projectId" 
               Required="True" Desc="Login to https://console.cloud.google.com/bigquery and choose Project dropdown at the top to see list of Projects. Over there you will find ProjectID next to ProjectName. You need to get ProjectID which has BigQuery API support enabled."
               />

        <Param Name="DatasetId" Label="DatasetId (Choose after ProjectId)" Required="True" 
			OptionsEndPoint="list_datasets" 
			OptionsEndPointValueColumn="datasetId" 
			OptionsEndPointParameters="ProjectId=[$ProjectId$]" Value="" Desc="Default Dataset Name you like to use when listing tables (e.g. MyDataset)." />

        <Param Name="RetryMode" Value="RetryWhenStatusCodeMatch" Options="None;RetryAny;RetryWhenStatusCodeMatch;" Hidden="True" />
        <Param Name="RetryStatusCodeList" Value="429" Hidden="True" />
        <Param Name="RetryCountMax" Value="5" Hidden="True" />
        <Param Name="RetryMultiplyWaitTime" Value="True" Hidden="True" />   
		<Param Name="Location" Template="LocationParam" />
      </Params>

      <Notes>
        <![CDATA[Use these steps to authenticate as a service account rather than a Google / GSuite User. Learn more about service accounts <a href="https://cloud.google.com/iam/docs/service-accounts" target="_blank">here</a><p />
To call the Google API as a service account we need to perform the following steps, listed in 3 sections (detailed steps are found in the help link at the end)<p />
<h3>Create Project</h3>
The first thing is to create a Project so we can call the Google API. Skip this section if you already have a Project (go to the next section).
<ol>
  <li>Go to <a target="_blank" href="https://console.developers.google.com/">Google API Console</a></li>
  <li>From the Project Dropdown (usually found at the top bar) click Select Project</li>
  <li>On the Project Popup click <b>CREATE PROJECT</b></li>
  <li>Once the project is created you can click Select Project to switch the context (You can click on the Notification link or choose from the Top Dropdown)</li>
  <li>Click <b>ENABLE APIS AND SERVICES</b></li>
  <li>Now we need to Enable two APIs one by one (BigQuery API and Cloud Resource Manager API).</li>
  <li>Search <b>BigQuery API</b>. Select and click <b>ENABLE</b></li>
  <li>Search <b>Cloud Resource Manager API</b>. Select and click <b>ENABLE</b></li>
</ol>

<h3>Create Service Account</h3> 
Once the Project is created and the APIs are enabled we can create a service account under that project. A service account has an ID which looks like an email address (not to be confused with a Google / Gmail email ID).
<ol>
  <li>Go to <a target="_blank" href="https://console.cloud.google.com/projectselector/iam-admin/serviceaccounts/create?supportedpurview=project/">Create Service Account</a></li>
  <li>From the Project Dropdown (usually found at the top bar) click Select Project</li>
  <li>Enter <b>Service account name</b> and <b>Service account description</b></li>
  <li>Click on <b>Create</b>. Now you should see an option to assign Service Account permissions (See Next Section). </li>
</ol>

<h3>Give Permission to Service Account</h3> 
By default a service account can't access BigQuery data or list BigQuery Projects, so we need to grant those permissions using the steps below.
<ol>
  <li>After you Create Service Account look for Permission drop down in the Wizard. </li>
  <li>Choose <b>BigQuery</b> -> <b>BigQuery Admin</b> role so we can read/write data. (NOTE: If you just need read only access then you can choose BigQuery Data Viewer)</li>
  <li>Now choose one more <b>Project</b> -> <b>Viewer</b> and add that role so we can query Project Ids.</li>
  <li>Click on <b>Continue</b>. Now you should see an option to Create Key (See Next Section).</li>
</ol>

<h3>Create Key (P12)</h3>
Once service account is created and Permission is assigned we need to create key file. 
<ol>
  <li>In the Cloud Console, click the email address for the service account that you created.</li>
  <li>Click <b>Keys</b>.</li>
  <li>Click <b>Add key</b>, then click <b>Create new key</b>.</li>
  <li>Click <b>Create</b> and select <b>P12</b> format. A <b>P12 key file</b> is downloaded to your computer. We will use this file in our API connection.</li>
  <li>Click <b>Close</b>.</li>
  <li>Now you may use downloaded *.p12 key file as secret file and Service Account Email as Client ID (e.g. <b>some_name@some_name.iam.gserviceaccount.com</b>).</li>
</ol>

<h3>Manage Permissions / Give Access to Other Projects</h3>
We saw how to add permissions for the Service Account during the Account Creation Wizard, but if you ever wish to edit them after it is created, or wish to give permission for other projects, then perform the following steps.
<ol>
  <li>From the top Select Project for which you like to edit Permission.</li>
  <li>Go to <b>IAM</b> Menu option (<a href="https://console.cloud.google.com/iam-admin/iam" target="_blank">here</a>) 
  <br />Link to IAM: https://console.cloud.google.com/iam-admin/iam </li>
  <li>Go to the <b>Permissions</b> tab. There you will find the <b>ADD</b> button.</li>
  <li>Enter Service account email for which you like to grant permission. Select role you wish to assign. </li>
</ol>
     
      
]]>
      </Notes>
    </Auth>
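    <!-- Illustrative only: a typical connection string produced by the Service Account auth above,
         based on the ConnStr template and the Params defined in this Auth element. The email address
         and file path are placeholders, not real values.

           Provider=GoogleBigQuery;UseCustomApp=True;UseClientCertificate=True;CertificateStorageType=PfxFile;
           GrantType=ClientCredentials;CertificatePath=c:\secure\my-service-account.p12;
           ClientId=some_name@my_project.iam.gserviceaccount.com;ProjectId=my-gcp-project-id;DatasetId=MyDataset;

         The *.p12 file is the key downloaded in the "Create Key (P12)" steps and ClientId is the
         service account email shown on the service account page. -->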

  </Auths>

  <Template>
    <EndPoint Name="Paginate">
      <Params>
        <Param Name="NextUrlAttributeOrExpr" Type="Property" Value="$.nextPageToken" />
        <Param Name="NextUrlSuffix" Type="Property" Value="pageToken=&lt;%nextlink%&gt;" />
      </Params>
    </EndPoint>
    <EndPoint Name="PaginateRows">
      <Params>
        <Param Name="NextUrlAttributeOrExpr" Type="Property" Value="$.pageToken" Hidden="True"/>
        <Param Name="NextUrlSuffix" Type="Property" Value="pageToken=&lt;%nextlink%&gt;" Hidden="True"/>
      </Params>
    </EndPoint>
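    <!-- How the pagination templates above work (sketch, based on the properties they set): the engine reads
         the token named by NextUrlAttributeOrExpr from each response (nextPageToken for list endpoints,
         pageToken for query / tabledata results) and, while a token is returned, issues the next request with
         the NextUrlSuffix appended. Illustrative sequence:

           GET projects/MyProject/datasets                    returns { "nextPageToken": "abc123", ... }
           GET projects/MyProject/datasets?pageToken=abc123   fetches the next page

         Paging stops when the response no longer contains the token. -->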
	
	<Param Name="LocationParam" Value="" Label="Job Location" 
		Options="System Default=;Data centers in the United States=US;Data centers in the European Union=EU;Columbus, Ohio=us-east5;Iowa=us-central1;Las Vegas=us-west4;Los Angeles=us-west2;Montréal=northamerica-northeast1;Northern Virginia=us-east4;Oregon=us-west1;Salt Lake City=us-west3;São Paulo=southamerica-east1;Santiago=southamerica-west1;South Carolina=us-east1;Toronto=northamerica-northeast2;Delhi=asia-south2;Hong Kong=asia-east2;Jakarta=asia-southeast2;Melbourne=australia-southeast2;Mumbai=asia-south1;Osaka=asia-northeast2;Seoul=asia-northeast3;Singapore=asia-southeast1;Sydney=australia-southeast1;Taiwan=asia-east1;Tokyo=asia-northeast1;Belgium=europe-west1;Finland=europe-north1;Frankfurt=europe-west3;London=europe-west2;Madrid=europe-southwest1;Milan=europe-west8;Netherlands=europe-west4;Paris=europe-west9;Warsaw=europe-central2;Zürich=europe-west6;AWS - US East (N. Virginia)=aws-us-east-1;Azure - East US 2=azure-eastus2;Custom Name (Type your own)=type-region-id-here"
		Desc="The geographic location where the job should run. For Non-EU and Non-US datacenters we suggest you to supply this parameter to avoid any error." HelpLink="https://cloud.google.com/bigquery/docs/locations#specifying_your_location"/>
		
  </Template>

  <EndPoints>
    <EndPoint Name="get_query" IsSqlEndPoint="True" Label="Read Data using SQL Query -OR- Execute Script (i.e. CREATE, SELECT, INSERT, UPDATE, DELETE)"
              Group="Query"
              Desc="Runs a BigQuery SQL query synchronously and returns query results if the query completes within a specified timeout"
              HelpLink="https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query"
              Url="projects/[$ProjectId$]/queries" Method="POST" Filter="$.jobReference.jobId"
              ContentType="application/json" NoOutputColumns="True"
              MetaDetectionOrder="Static" MetaScanRows="10000" CachedTtl="30" UseRawCache="True" 
              OutputDefaultColumnIfNoMeta="True"
      >
      <Body><![CDATA[{"query":"[$Sql$]","maxResults":10,"timeoutMs":[$timeoutMs$],"useLegacySql":[$useLegacySql$] , "location" : "[$Location$]" }]]></Body>
      <Params>
        <!--Param Name="ProjectId" Label="&#009;ProjectId" Value="" OptionsEndPoint="list_projects" OptionsEndPointValueColumn="projectId" OptionsEndPointLabelColumn="name" /-->
        <Param Name="Sql" IsSqlParameter="True" Label="SQL Statement (i.e. SELECT / DROP / CREATE)" Editor="MultiLine" Required="True" Functions="JSONENC"
               Value="SELECT title,id,language,wp_namespace,reversion_id ,comment,num_characters FROM bigquery-public-data.samples.wikipedia LIMIT 1000"
               Options="Example1=SELECT title,id,language,wp_namespace,reversion_id ,comment,num_characters FROM bigquery-public-data.samples.wikipedia LIMIT 1000;Example2=CREATE TABLE TestDataset.Table1 (ID INT64,Name STRING,BirthDate DATETIME, Active BOOL);Example3=INSERT TestDataset.Table1 (ID, Name,BirthDate,Active) VALUES(1,'AA','2020-01-01',true),(2,'BB','2020-01-02',true),(3,'CC','2020-01-03',false)"
               />
        <Param Name="useLegacySql" Label="Use Legacy SQL Syntax?" Options="false;true" Value="false" />
        <Param Name="timeoutMs" Value="90000" Label="timeout (Milliseconds)" Desc="Wait until timeout is reached." Options="false;true" />
		<Param Name="Location" Template="LocationParam" />
      </Params>

      <EndPoint Name="check_job_status" Url="projects/[$ProjectId$]/jobs/[$parent.jobId$]?location=[$Location$]" Method="GET" Filter="$.status.state--FAST" MaxRows="1" NoOutputColumns="True" >
        <Params>
          <Param Name="EnableStatusCheck" Type="Property" Value="true" />
          <Param Name="StatusCheckIterationWaitSeconds" Type="Property" Value="4" />
          <Param Name="StatusCheckMaxWaitSeconds" Type="Property" Value="7200" />
          <Param Name="StatusContinueValue" Type="Property" Value="PENDING|RUNNING" />
          <Param Name="StatusSuccessValue" Type="Property" Value="DONE" />
        </Params>
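        <!-- Polling behavior (sketch, derived from the status-check properties above): after the query job is
             submitted, this child endpoint calls jobs.get every StatusCheckIterationWaitSeconds (4 s) and keeps
             waiting while $.status.state is PENDING or RUNNING, e.g. {"status":{"state":"RUNNING"}}. When the
             state becomes DONE, the nested get_job_result endpoint below fetches the rows; if DONE is not
             reached within StatusCheckMaxWaitSeconds (7200 s) the call gives up. -->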
        <EndPoint Name="get_job_result" Template="PaginateRows" Url="projects/[$ProjectId$]/queries/[$parent.[0].jobId$]?location=[$Location$]" Method="GET" Filter="$.rows[*]">
          <Params>
            <Param Name="EnableStatusCheck" Type="Property" Value="false" />
            <!--Param Name="NextUrlAttributeOrExpr" Type="Property" Value="$.pageToken" Hidden="True"/>
            <Param Name="NextUrlSuffix" Type="Property" Value="pageToken=&lt;%nextlink%&gt;" Hidden="True"/-->

            <Param Name="ArrayTransformType" Type="Property" Value="TransformComplexTwoDimensionalArray"  />
            <Param Name="ArrayTransRowValueFilter" Type="Property" Value="$.f[*].v" />
            <Param Name="ArrayTransColumnNameFilter" Type="Property" Value="$.schema.fields[*].name"  />
          </Params>
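          <!-- Illustrative only: the two array-transform properties above flatten BigQuery's row format into
               columns. A query response like

                 { "schema": { "fields": [ {"name":"id"}, {"name":"title"} ] },
                   "rows":   [ { "f": [ {"v":"1"}, {"v":"Hello"} ] } ] }

               becomes one output row with columns id=1 and title=Hello: ArrayTransColumnNameFilter picks the
               column names from $.schema.fields[*].name and ArrayTransRowValueFilter picks each cell value
               from $.f[*].v. -->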
        </EndPoint>
      </EndPoint>

      <OutputColumns>
        <Column Name="-Dynamic-" Expand="True" DataEndPoint="get_query_schema"
                ColumnInfoMap="Name=name;DataType=type" DataTypeMap="DT_I8=integer,int64;DT_WSTR=string,record;DT_DBTIMESTAMP=datetime;DT_DBDATE=date;DT_BOOL=bool,boolean;DT_R8=float,numeric,timestamp;DT_STR=bytes;DT_NUMERIC=bignumeric,bigdecimal" />
      </OutputColumns>

    </EndPoint>

    <EndPoint  Name="read_table_data" Label="Read Table Rows" Group="Table" Desc="Gets the specified table resource by table ID. This method does not return the data in the table, it only returns the table resource, which describes the structure of this table."
          HelpLink="https://cloud.google.com/bigquery/docs/reference/rest/v2/tabledata/list"
          Url="projects/[$ProjectId$]/datasets/[$DatasetId$]/tables/[$TableId$]/data" Method="GET" Filter="$.rows[*]" MetaDetectionOrder="Static"
           Template="PaginateRows">
      <Params>
        <Param Name="ProjectId" Label="&#009;&#009;ProjectId" Value="" OptionsEndPoint="list_projects" OptionsEndPointValueColumn="projectId" OptionsEndPointLabelColumn="name" Desc="Leave this value blank to use ProjectId from connection settings" />
        <Param Name="DatasetId" Label="&#009;DatasetId" Desc="Leave this value blank to use DatasetId from connection settings" OptionsEndPoint="list_datasets" OptionsEndPointValueColumn="datasetId" OptionsEndPointLabelColumn="datasetId" />
        <Param Name="TableId" Label="&#009;TableId"  Required="True" OptionsEndPoint="list_tables" OptionsEndPointParameters="DatasetId=[$DatasetId$]" OptionsEndPointValueColumn="tableId" OptionsEndPointLabelColumn="tableId" />

        <!--Param Name="NextUrlAttributeOrExpr" Type="Property" Value="$.pageToken" Hidden="True"/>
        <Param Name="NextUrlSuffix" Type="Property" Value="pageToken=&lt;%nextlink%&gt;" Hidden="True"/-->
        <Param Name="ArrayTransformType" Type="Property" Value="TransformComplexTwoDimensionalArray"  />
        <Param Name="ArrayTransRowValueFilter" Type="Property" Value="$.f[*].v" />
      </Params>

      <OutputColumns>
        <Column Name="-Dynamic-" Expand="True"
                DataEndPoint="get_table_schema"
                DataEndPointParameters="DatasetId=[$parent.tableReference.datasetId$];TableId=[$parent.tableReference.tableId$]"
                ColumnInfoMap="Name=name;DataType=type"
          DataTypeMap="DT_I8=integer,int64;DT_WSTR=string,record;DT_DBTIMESTAMP=datetime;DT_DBDATE=date;DT_BOOL=bool,boolean;DT_R8=float,numeric,timestamp;DT_STR=bytes;DT_NUMERIC=bignumeric,bigdecimal" />
      </OutputColumns>

    </EndPoint>

    <EndPoint  Name="get_[$parent.tableReference.datasetId$]_[$parent.tableReference.tableId$]" Label="[$parent.tableReference.datasetId$].[$parent.tableReference.tableId$]"
               Group="Records" Desc="Read data from [$parent.tableReference.datasetId$].[$parent.tableReference.tableId$] for project ."
               HelpLink="https://cloud.google.com/bigquery/docs/reference/rest/v2/tabledata/list"
               Url="projects/[$parent.tableReference.projectId$]/datasets/[$parent.tableReference.datasetId$]/tables/[$parent.tableReference.tableId$]/data"
               Method="GET" Filter="$.rows[*]" MetaDetectionOrder="Static"
               Template="PaginateRows"
               Expand="True"
               DataEndPoint="list_tables"
      >
      <Params>
        <Param Name="ArrayTransformType" Type="Property" Value="TransformComplexTwoDimensionalArray"  />
        <Param Name="ArrayTransRowValueFilter" Type="Property" Value="$.f[*].v" />
      </Params>

      <OutputColumns>
        <Column Name="-Dynamic-"
                Expand="True"
                DataEndPoint="get_table_schema"
                ColumnInfoMap="Name=name;DataType=type"
                DataEndPointParameters="DatasetId=[$parent.tableReference.datasetId$];TableId=[$parent.tableReference.tableId$]"
                DataTypeMap="DT_I8=integer,int64;DT_WSTR=string,record;DT_DBTIMESTAMP=datetime;DT_DBDATE=date;DT_BOOL=bool,boolean;DT_R8=float,numeric,timestamp;DT_STR=bytes;DT_NUMERIC=bignumeric,bigdecimal"
      />
      </OutputColumns>
    </EndPoint>
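    <!-- Note (sketch of how the dynamic endpoints above work): because Expand="True" is combined with
         DataEndPoint="list_tables", the engine generates one read endpoint per table returned by list_tables,
         substituting that table's projectId / datasetId / tableId into the Name, Label and Url. For example,
         a table TestDataset.Table1 would surface as an endpoint named get_TestDataset_Table1 that reads
         projects/{projectId}/datasets/TestDataset/tables/Table1/data. -->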


    <EndPoint Name="list_projects" Label="List Projects"  Group="Project" Desc="Lists Projects that the caller has permission on and satisfy the specified filter."
	HelpLink="https://cloud.google.com/resource-manager/reference/rest/v1/projects/list"
	Url="https://cloudresourcemanager.googleapis.com/v1/projects?filter=[$SearchFilter$]" Method="GET" Filter="$.projects[*]"
       Template="Paginate" CachedTtl="60">
      <Params>
        <Param Name="SearchFilter" Value=""
		Desc="An expression for filtering the results of the request. Filter rules are case insensitive. If multiple fields are included in a filter query, the query will return results that match any of the fields. Some eligible fields for filtering are: name, id, labels.{key} (where key is the name of a label), parent.type, parent.id, lifecycleState. Example: name:how* "
		/>

      </Params>
      <OutputColumns>
        <Column Name="projectId" DataType="DT_WSTR" Length="255" />
        <Column Name="name" DataType="DT_WSTR" Length="255"/>
        <Column Name="projectNumber" DataType="DT_WSTR" Length="255" />
        <Column Name="lifecycleState" DataType="DT_WSTR" Length="255"/>
        <Column Name="createTime" DataType="DT_DBTIMESTAMP"/>
      </OutputColumns>
    </EndPoint>

    <EndPoint Name="list_datasets" Label="List Datasets" Group="Dataset" Desc="Lists all BigQuery datasets in the specified project to which the user has been granted the READER dataset role."
	HelpLink="https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets/list"
	Url="projects/[$ProjectId$]/datasets?filter=[$SearchFilter$]" Method="GET" Filter="$.datasets[*]" Template="Paginate" CachedTtl="60">
      <Params>
        <Param Name="ProjectId" OptionsEndPoint="list_projects" OptionsEndPointValueColumn="projectId" OptionsEndPointLabelColumn="name" Value=""  />
        <Param Name="SearchFilter" Value="" Desc="An expression for filtering the results of the request. Filter rules are case insensitive. If multiple fields are included in a filter query, the query will return results that match any of the fields. Some eligible fields for filtering are: name, id, labels.{key} (where key is the name of a label), parent.type, parent.id, lifecycleState. Example: name:how*" />

        <Param Name="all" Type="Query" Value="" Desc="Whether to list all datasets, including hidden ones" Options=";True;False"/>
      </Params>

      <OutputColumns>
        <Column Name="datasetReference.datasetId" Label="datasetId" DataType="DT_WSTR" Length="255" />
        <Column Name="datasetReference.projectId" Label="projectId" DataType="DT_WSTR" Length="255" />

        <Column Name="kind" DataType="DT_WSTR" Length="100"/>
        <Column Name="id" DataType="DT_WSTR" Length="100"/>
        <Column Name="location" DataType="DT_WSTR" Length="100"/>
      </OutputColumns>

    </EndPoint>

    <EndPoint Name="post_dataset" Label="Create Dataset" Group="Dataset" Desc="Creates a new empty dataset."
 HelpLink="https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets/insert"
 Url="projects/[$ProjectId$]/datasets" Method="POST" ContentType="application/json">

      <Body><![CDATA[{"description": "[$Description$]", "datasetReference": { "datasetId": "[$Name$]"} } ]]></Body>
      <Params>
        <Param Name="ProjectId" OptionsEndPoint="list_projects" OptionsEndPointValueColumn="projectId" OptionsEndPointLabelColumn="name" Value=""  />
        <Param Name="Name" Label="Dataset Name" Value="" Desc="Enter dataset name" Functions="JSONENC" Required="True" />
        <Param Name="Description" Value="This is dataset description" Functions="JSONENC" />
      </Params>

      <OutputColumns>
        <Column Name="datasetReference.datasetId" Label="datasetId" DataType="DT_WSTR" Length="255" />
        <Column Name="datasetReference.projectId" Label="projectId" DataType="DT_WSTR" Length="255" />

        <Column Name="kind" DataType="DT_WSTR" Length="100"/>
        <Column Name="id" DataType="DT_WSTR" Length="100"/>
        <Column Name="location" DataType="DT_WSTR" Length="100"/>
        <Column Name="friendlyName" DataType="DT_WSTR" Length="255"/>
        <Column Name="description" DataType="DT_WSTR" Length="1000"/>
        <Column Name="access" DataType="DT_WSTR" Length="1000"/>
      </OutputColumns>

    </EndPoint>

    <EndPoint Name="delete_dataset" Label="Delete Dataset" Group="Dataset" Desc="Deletes the dataset specified by the datasetId value. Before you can delete a dataset, you must delete all its tables, either manually or by specifying deleteContents. Immediately after deletion, you can create another dataset with the same name."
   HelpLink="https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets/delete"
   Url="projects/[$ProjectId$]/datasets/[$DatasetId$]" Method="DELETE" >
      <Params>
        <Param Name="ProjectId" OptionsEndPoint="list_projects" OptionsEndPointValueColumn="projectId" OptionsEndPointLabelColumn="name" Value=""  />
        <Param Name="DatasetId" Label="&#009;DatasetId" Required="True" OptionsEndPoint="list_datasets" OptionEndPointParameters="ProjectId=[$ProjectId$];" OptionsEndPointValueColumn="datasetId" OptionsEndPointLabelColumn="id" Value=""  />
        <Param Name="deleteContents" Label="&#009;&#009;Delete All Tables" Type="Query" Options=";True;False" Desc="If True, delete all the tables in the dataset. If False and the dataset contains tables, the request will fail. Default is False"/>
      </Params>
      <OutputColumns>
        <Column Name="Response" DataType="DT_WSTR" Length="255" />
      </OutputColumns>
    </EndPoint>

    <EndPoint Name="delete_table" Label="Delete Table" Group="Table" Desc="Deletes the dataset specified by the datasetId value. Before you can delete a dataset, you must delete all its tables, either manually or by specifying deleteContents. Immediately after deletion, you can create another dataset with the same name."
   HelpLink="https://cloud.google.com/bigquery/docs/reference/rest/v2/tables/delete"
   Url="projects/[$ProjectId$]/datasets/[$DatasetId$]/tables/[$TableId$]" Method="DELETE" >
      <Params>
        <Param Name="ProjectId" Label="&#009;&#009;ProjectId" Value="" OptionsEndPoint="list_projects" OptionsEndPointValueColumn="projectId" OptionsEndPointLabelColumn="name" />
        <Param Name="DatasetId" Label="&#009;DatasetId" OptionsEndPoint="list_datasets" OptionEndPointParameters="ProjectId=[$ProjectId$];" OptionsEndPointValueColumn="datasetId" OptionsEndPointLabelColumn="id" Value=""  />
        <Param Name="TableId" Label="TableId" Required="True" OptionsEndPoint="list_tables" OptionsEndPointParameters="DatasetId=[$DatasetId$]" OptionsEndPointValueColumn="tableId" OptionsEndPointLabelColumn="tableId" />
      </Params>
      <OutputColumns>
        <Column Name="Response" DataType="DT_WSTR" Length="255" />
      </OutputColumns>
    </EndPoint>

    <EndPoint Name="list_tables" Label="List Tables" Group="Table" Desc="Lists BigQuery Tables for the specified project / dataset to which the user has been granted the READER dataset role."
	HelpLink="https://cloud.google.com/bigquery/docs/reference/rest/v2/tables/list"
	Url="projects/[$ProjectId$]/datasets/[$DatasetId$]/tables" Method="GET" Filter="$.tables[*]"  Template="Paginate" CachedTtl="60">
      <Params>
        <Param Name="ProjectId" Label="&#009;&#009;ProjectId" Value="" OptionsEndPoint="list_projects" OptionsEndPointValueColumn="projectId" OptionsEndPointLabelColumn="name" />
        <Param Name="DatasetId" Label="&#009;DatasetId" Required="True" OptionsEndPoint="list_datasets" OptionEndPointParameters="ProjectId=[$ProjectId$];" OptionsEndPointValueColumn="datasetId" OptionsEndPointLabelColumn="id" Value=""  />
      </Params>

      <OutputColumns>
        <Column Name="tableReference.tableId" Label="tableId" DataType="DT_WSTR" Length="255" />
        <Column Name="tableReference.datasetId" Label="datasetId" DataType="DT_WSTR" Length="255" />
        <Column Name="tableReference.projectId" Label="projectId" DataType="DT_WSTR" Length="255" />

        <Column Name="kind" DataType="DT_WSTR" Length="100"/>
        <Column Name="id" DataType="DT_WSTR" Length="100"/>
        <Column Name="type" DataType="DT_WSTR" Length="100"/>
        <Column Name="creationTime" DataType="DT_DBTIMESTAMP" Functions="UNIX_TIMESTAMP_MS_TO_DATE"/>
      </OutputColumns>

    </EndPoint>

    <EndPoint  Name="get_query_schema" Group="Query" Label="Get Query Schema (From SQL)" Desc="Runs a BigQuery SQL query synchronously and returns query schema"
	HelpLink="https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query"
	Url="projects/[$ProjectId$]/queries" Method="POST" Filter="$.schema.fields[*]"
  ContentType="application/json" Hidden="True" CachedTtl="60" UseRawCache="True">
      <Body><![CDATA[{"query":"[$Sql$]","maxResults":10,"timeoutMs":[$timeoutMs$],"useLegacySql":[$useLegacySql$] , "location" : "[$Location$]" }]]></Body>
      <Params>
        <Param Name="Sql" Label="SQL Query" Editor="MultiLine" Required="True" Functions="JSONENC" Value=""/>
        <Param Name="useLegacySql" Label="Use Legacy SQL Syntax?" Options="false;true" Value="false" />
        <Param Name="timeoutMs" Value="90000" Label="timeout (Milliseconds)" Desc="Wait until timeout is reached." Options="false;true" />
		<Param Name="Location" Template="LocationParam" />
      </Params>
      <OutputColumns>
        <Column Name="name" DataType="DT_WSTR" Length="255" />
        <Column Name="type" DataType="DT_WSTR" Length="255" />
      </OutputColumns>
    </EndPoint>

    <EndPoint  Name="get_table_schema" Label="Get Table Schema" Group="Table" Desc="Gets the specified table resource by table ID. This method does not return the data in the table, it only returns the table resource, which describes the structure of this table."
      HelpLink="https://cloud.google.com/bigquery/docs/reference/rest/v2/tables/get"
      Url="projects/[$ProjectId$]/datasets/[$DatasetId$]/tables/[$TableId$]" Method="GET" Filter="$.schema.fields[*]" CachedTtl="60">
      <Params>
        <Param Name="DatasetId" Label="DatasetId" Required="True" OptionsEndPoint="list_datasets" OptionsEndPointValueColumn="datasetId" OptionsEndPointLabelColumn="datasetId" />
        <Param Name="TableId" Label="TableId"  Required="True" OptionsEndPoint="list_tables" OptionsEndPointParameters="DatasetId=[$DatasetId$]" OptionsEndPointValueColumn="tableId" OptionsEndPointLabelColumn="tableId" />
      </Params>
      <OutputColumns>
        <Column Name="name" DataType="DT_WSTR" Length="255" />
        <Column Name="type" DataType="DT_WSTR" Length="255" />
      </OutputColumns>
    </EndPoint>

    <EndPoint Name="insert_table_data" Group="Table"
              Url="projects/[$ProjectId$]/datasets/[$DatasetId$]/tables/[$TableId$]/insertAll"
              EnableRequestCompression="True"
              Method="POST"
              BatchSize="10000"
              MaxRequestSize="9MB"              
              ContentType="application/json"
              Filter="$.insertErrors[*].errors[*]"
              FailIf="RowsFoundAndNotHandled"
              IncludeParentColumns="True"
              >
      <Params>
        <Param Name="ProjectId" Label="&#009;&#009;ProjectId" Value="" OptionsEndPoint="list_projects" OptionsEndPointValueColumn="projectId" OptionsEndPointLabelColumn="name" />
        <Param Name="DatasetId" Label="&#009;DatasetId" OptionsEndPoint="list_datasets" OptionsEndPointValueColumn="datasetId" OptionsEndPointLabelColumn="datasetId" />
        <Param Name="TableId" Label="TableId" OptionsEndPoint="list_tables" OptionsEndPointParameters="DatasetId=[$DatasetId$]" OptionsEndPointValueColumn="tableId" OptionsEndPointLabelColumn="tableId" />
      </Params>
      <Body><![CDATA[{"kind" : "bigquery#tableDataInsertAllRequest", "rows" : {$rows$}}]]></Body>
      <LayoutMap>
        <![CDATA[<?xml version="1.0" encoding="utf-8"?>
<settings>
  <dataset id="root" readfrominput="True" />
  <map name="json">
      <map src="*"  />
  </map>  
</settings>]]>
      </LayoutMap>
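      <!-- Illustrative only: the LayoutMap above wraps each incoming row under a "json" node, so with two
           input rows (ID, Name) the Body template renders an insertAll request roughly like:

             { "kind": "bigquery#tableDataInsertAllRequest",
               "rows": [ { "json": { "ID": 1, "Name": "AA" } },
                         { "json": { "ID": 2, "Name": "BB" } } ] }

           Rows are sent in batches of up to BatchSize (10000) rows and at most MaxRequestSize (9MB) per request. -->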
      <OutputColumns>
        <Column Name="P_insertErrors_index" Label="index" DataType="DT_I8" />
        <Column Name="reason" DataType="DT_WSTR" Length="255"/>
        <Column Name="location" DataType="DT_WSTR" Length="255"/>
        <Column Name="debugInfo" DataType="DT_WSTR" Length="4000"/>
        <Column Name="message" DataType="DT_WSTR" Length="4000"/>
      </OutputColumns>
    </EndPoint>

    <EndPoint Name="post_[$parent.tableReference.datasetId$]_[$parent.tableReference.tableId$]" Group="Records"
              Url="projects/[$parent.tableReference.projectId$]/datasets/[$parent.tableReference.datasetId$]/tables/[$parent.tableReference.tableId$]/insertAll"
              Expand="True"
              DataEndPoint="list_tables"
              EnableRequestCompression="True"
              Method="POST"
              BatchSize="10000"
              MaxRequestSize="9MB"
              ContentType="application/json"
              Filter="$.insertErrors[*].errors[*]"
              FailIf="RowsFoundAndNotHandled"
              IncludeParentColumns="True"
              >
      <Params>
        <!--Param Name="DatasetId" Label="&#009;DatasetId" OptionsEndPoint="list_datasets" OptionsEndPointValueColumn="datasetId" OptionsEndPointLabelColumn="datasetId" />
        <Param Name="TableId" Label="TableId" OptionsEndPoint="list_tables" OptionsEndPointParameters="DatasetId=[$DatasetId$]" OptionsEndPointValueColumn="tableId" OptionsEndPointLabelColumn="tableId" />
        <Param Name="EnableRequestCompression" Value="True" Hidden="True" Type="Property" /-->
      </Params>
      <Body><![CDATA[{"kind" : "bigquery#tableDataInsertAllRequest", "rows" : {$rows$}}]]></Body>
      <LayoutMap>
        <![CDATA[<?xml version="1.0" encoding="utf-8"?>
<settings>
  <dataset id="root" readfrominput="True" />
  <map name="json">
      <map src="*"  />
  </map>  
</settings>]]>
      </LayoutMap>
      <OutputColumns>
        <Column Name="P_insertErrors_index" Label="index" DataType="DT_I8" />
        <Column Name="reason" DataType="DT_WSTR" Length="255"/>
        <Column Name="location" DataType="DT_WSTR" Length="255"/>
        <Column Name="debugInfo" DataType="DT_WSTR" Length="4000"/>
        <Column Name="message" DataType="DT_WSTR" Length="4000"/>
      </OutputColumns>
      <InputColumns>
        <Column Name="-Dynamic-"
                Expand="True"
                DataEndPoint="get_table_schema"
                DataEndPointParameters="DatasetId=[$parent.tableReference.datasetId$];TableId=[$parent.tableReference.tableId$]"
                ColumnInfoMap="Name=name;DataType=type" DataTypeMap="DT_I8=integer,int64;DT_WSTR=string,record;DT_DBTIMESTAMP=datetime;DT_DBDATE=date;DT_BOOL=bool,boolean;DT_R8=float,numeric,timestamp;DT_STR=bytes;DT_NUMERIC=bignumeric,bigdecimal"
      />
      </InputColumns>
    </EndPoint>


  </EndPoints>

  <Tables>
    <!-- 
    <Table Name="Projects" Desc="List projects for which user have access"
           LookupEndPoint="list_projects"
           SelectEndPoint="list_projects" />
    
    <Table Name="Datasets" Desc="Insert, update, delete and list datasets"
           LookupEndPoint="list_datasets"
           SelectEndPoint="list_datasets"
           InsertEndPoint="post_dataset"
           DeleteEndPoint="delete_dataset" 
           />
    -->
    <Table Name="[$parent.tableReference.tableId$]" Label="[$parent.tableReference.datasetId$].[$parent.tableReference.tableId$]"
           SelectEndPoint="get_[$parent.tableReference.datasetId$]_[$parent.tableReference.tableId$]"
           InsertEndPoint="post_[$parent.tableReference.datasetId$]_[$parent.tableReference.tableId$]"
           LookupEndPoint="get_query"
           Expand="True"
           DataEndPoint="list_tables">
    </Table>
  </Tables>
  <Examples>
    <Example Group="ODBC" Label="Native Query (ServerSide): Query using Simple SQL"
             Desc="Server side BigQuery SQL query example. Prefix SQL with word #DirectSQL to invoke server side engine (Pass-through SQL). Query free dataset table (bigquery-public-data.samples.wikipedia)"
             HelpLink="https://cloud.google.com/bigquery/public-data"
             Code="#DirectSQL SELECT * FROM bigquery-public-data.samples.wikipedia LIMIT 1000 /* try your own dataset or Some FREE dataset like nyc-tlc.yellow.trips -- 3 parts ([Project.]Dataset.Table) */"></Example>

    <Example Group="ODBC" IsDefault="True" Label="Native Query (ServerSide): Query using Complex SQL"
             Desc="Server side SQL query example of BigQuery. Prefix SQL with word #DirectSQL to invoke server side engine (Pass-through SQL). Query free dataset table (bigquery-public-data.usa_names.usa_1910_2013)"
             HelpLink="https://cloud.google.com/bigquery/public-data" >
      <Code>
        <![CDATA[#DirectSQL 
SELECT name, gender, SUM(number) AS total
FROM bigquery-public-data.usa_names.usa_1910_2013
GROUP BY name, gender
ORDER BY total DESC
LIMIT 10 
]]>
      </Code>
    </Example>

    <Example Group="ODBC" Label="Native Query (ServerSide): Create Table / Run Other DDL"
             Desc="Example of how to run Valid BigQuery DDL statement. Prefix SQL with word #DirectSQL to invoke server side engine (Pass-through SQL)"
             Code="#DirectSQL CREATE TABLE TestDataset.Table1 (ID INT64,Name STRING,BirthDate DATETIME, Active BOOL)"
             HelpLink="https://cloud.google.com/bigquery/docs/reference/standard-sql/data-definition-language#create_table_statement" ></Example>
    <Example Group="ODBC" Label="Native Query (ServerSide): UPDATE Table data for complex types (e.g. Nested RECORD, Geography, JSON)"
             Desc="Example of how to run Valid BigQuery DML statement ()e.g. UPDATE / INSERT / DELETE). This usecase shows how to update record with complex data types such as RECORD (i.e Array), Geography, JSON and more. Prefix SQL with word #DirectSQL to invoke server side engine (Pass-through SQL)"
             HelpLink="https://cloud.google.com/bigquery/docs/reference/standard-sql/data-definition-language#create_table_statement" >
<Code>
        <![CDATA[#DirectSQL 
Update TestDataset.DataTypeTest 
Set ColTime='23:59:59.123456',
 ColGeography=ST_GEOGPOINT(34.150480, -84.233870),
 ColRecord=(1,"AA","Column3 data"),
 ColBigNumeric=1222222222222222222.123456789123456789123456789123456789,
 ColJson= JSON_ARRAY('{"doc":1, "values":[{"id":1},{"id":2}]}') 
Where ColInteger=1
]]>
      </Code>				
			 </Example>			 
    <Example Group="ODBC" Label="Native Query (ServerSide): DROP Table (if exists)  / Other DDL"
             Desc="Example of how to run Valid BigQuery DDL statement. Prefix SQL with word #DirectSQL to invoke server side engine (Pass-through SQL)"
             Code="#DirectSQL DROP TABLE IF EXISTS Myproject.Mydataset.Mytable"
             HelpLink="https://cloud.google.com/bigquery/docs/reference/standard-sql/data-definition-language#drop_table_statement" ></Example>
    <Example Group="ODBC" Label="Native Query (ServerSide): Call Stored Procedure"
             Desc="Example of how to run BigQuery Stored Procedure and pass parameters. Assuming you created a valid stored proc called usp_GetData in TestDataset, call like below."
             Code="#DirectSQL CALL TestDataset.usp_GetData(1)"
             HelpLink="https://cloud.google.com/bigquery/docs/procedures" ></Example>

    <Example Group="ODBC" Label="INSERT Single Row" Code="INSERT INTO MyBQTable1(SomeBQCol1, SomeBQCol2) Values(1,'AAA')&#10;--WITH(DatasetId='TestDataset',Output='*')&#10;--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')" Desc="This is sample how you can insert into BigQuery using ZappySys query language. You can also use ProjectId='myproject-id' in WITH clause." ></Example>
    <Example Group="ODBC" Label="INSERT Multiple Rows from SQL Server" Code="INSERT INTO MyBQTable1 &#10;SOURCE(&#10;    'MSSQL'&#10;  , 'Data Source=localhost;Initial Catalog=tempdb;Initial Catalog=tempdb;Integrated Security=true'&#10;  , 'SELECT Col1 as SomeBQCol1,Col2 as SomeBQCol2 FROM SomeTable Where SomeCol=123'&#10;)&#10;--WITH(DatasetId='TestDataset',Output='*')&#10;--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')" Desc="This example shows how to bulk insert into Google BigQuery Table from microsoft SQL Server as external source. Notice that INSERT is missing column list. Its provided by source query so must produce valid column names found in target BQ Table (you can use SQL Alias in Column name to produce matching names)" HelpLink="https://zappysys.com/links/?id=10114" ></Example>
    <Example Group="ODBC" Label="INSERT Multiple Rows from any ODBC Source (DSN)" Code="INSERT INTO MyBQTable1 &#10;SOURCE(&#10;    'ODBC'&#10;  , 'DSN=MyDsn'&#10;  , 'SELECT Col1 as SomeBQCol1,Col2 as SomeBQCol2 FROM SomeTable Where SomeCol=123'&#10;) &#10;WITH(DatasetId='TestDataset')" Desc="This example shows how to bulk insert into Google BigQuery Table from any external ODBC Source (Assuming you have installed ODBC Driver and configured DSN). Notice that INSERT is missing column list. Its provided by source query so it must produce valid column names found in target BQ Table (you can use SQL Alias in Column name to produce matching names)" HelpLink="https://zappysys.com/links/?id=10114" ></Example>
    <Example Group="ODBC" Label="INSERT Multiple Rows from any JSON Files / API (Using ZappySys ODBC JSON Driver)" Code="INSERT INTO MyBQTable1 &#10;SOURCE(&#10;    'ODBC'&#10;  , 'Driver={ZappySys JSON Driver};Src='https://some-url/get-data''&#10;  , 'SELECT Col1 as SomeBQCol1,Col2 as SomeBQCol2 FROM _root_'&#10;)&#10;--WITH(DatasetId='TestDataset',Output='*')&#10;--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')" Desc="This example shows how to bulk insert into Google BigQuery Table from any external ODBC JSON API / File Source (Assuming you have installed ZappySys ODBC Driver for JSON). Notice that INSERT is missing column list. Its provided by source query so it must produce valid column names found in target BQ Table (you can use SQL Alias in Column name to produce matching names). You can also use similar approach to read from CSV files or XML Files. Just use CSV / XML driver rather than JSON driver in connection string. Refer this for more examples of JSON Query https://zappysys.com/onlinehelp/odbc-powerpack/scr/json-odbc-driver-sql-query-examples.htm" HelpLink="https://zappysys.com/links/?id=10114" ></Example>

    <Example Group="ODBC" Label="List Projects" Code="SELECT * FROM list_projects" Desc="Lists Projects for which user has access" ></Example>
    <Example Group="ODBC" Label="List Datasets" Code="SELECT * FROM list_datasets&#10;--WITH(ProjectId='MyProjectId')" Desc="Lists Datasets for specified project. If you do not specify ProjectId then it will use connection level details." ></Example>
    <Example Group="ODBC" Label="List Tables" Code="SELECT * FROM list_tables&#10;--WITH(ProjectId='MyProjectId')&#10;--WITH(ProjectId='MyProjectId',DatasetId='MyDatasetId')"  Desc="Lists tables for specified project / dataset. If you do not specify ProjectId or datasetId then it will use connection level details." ></Example>
    <Example Group="ODBC" Label="Delete dataset" Desc="Delete dataset for specified ID. If you like to delete all tables under that dataset then set deleteContents='true'"
             Code="SELECT * FROM delete_dataset WITH(DatasetId='MyDatasetId', deleteContents='False')"  ></Example>


  </Examples>
</ApiConfig>