update <table_name> set field_name = 'ABC' || chr(38) || 'DEF' where <where clause>
38 is the ASCII code for the ampersand (&). Written as chr(38), the character is treated as ordinary string data, so SQL*Plus never sees a literal & and never prompts for a substitution variable. I tried it, and it worked. (Alternatively, the SQL*Plus command SET DEFINE OFF disables substitution-variable processing entirely.)
Pretend you're managing a combination gas station and convenience store. You're open 24 hours a day, seven days per week. It is January 2013. Your target revenue goal for the month is $500,000. Divide by the 31 days in January, and you must push at least $16,129 in sales per day. The following table tracks your daily sales progress.

Example Data: Download and execute avgExamples.sql to create the example tables and data used in this article.
SELECT * FROM daily_sales
WHERE daily_date >= DATE '2013-01-01'
  AND daily_date < DATE '2013-02-01'
ORDER BY daily_date;

DAILY_DATE          SALES_AMT
------------------ ----------
01-JAN-13               17032
02-JAN-13               15043
03-JAN-13               16783
04-JAN-13               23343
05-JAN-13               25703
06-JAN-13               19538
07-JAN-13               18001
08-JAN-13               17154
09-JAN-13               26235
10-JAN-13               14983
11-JAN-13               16010
12-JAN-13                4300
13-JAN-13                2874

Your numbers look good until the 12th, when they fall off a cliff. Freezing rain struck your area that day. Ice has played havoc on tree limbs and power lines, leaving you without power for most of two days. Gasoline has stayed in the underground tanks, unpumped. Coffee pots have been cold. For two days all you've sold are potato chips and soda pop. You are worried. Your early numbers are good, but have these two days without power been enough to put you below where you need to be?
SELECT ROUND(AVG(sales_amt))
FROM daily_sales
WHERE daily_date >= DATE '2013-01-01'
  AND daily_date < DATE '2013-02-01';

ROUND(AVG(SALES_AMT))
---------------------
                16692

Whew! This is good news. Even with those two very bad days, you're still trending above your $16,129 target. Your hard work at exceeding plan is paying off by providing you a buffer in these difficult circumstances.
SELECT EXTRACT(MONTH FROM daily_date) AS "Month",
       ROUND(AVG(sales_amt)) AS "Avg Sales"
FROM daily_sales
WHERE daily_date >= DATE '2012-11-01'
  AND daily_date < DATE '2013-01-01'
GROUP BY EXTRACT(MONTH FROM daily_date)
ORDER BY EXTRACT(MONTH FROM daily_date);

     Month  Avg Sales
---------- ----------
        11      18817
        12      18803

These numbers are great news. Your current mean daily sales figure is just barely above the line at $16,692, but your track record from the prior two months indicates significant upside potential.
SELECT daily_date, sales_amt,
       ROUND(AVG(sales_amt) OVER (ORDER BY daily_date)) AS "Running Avg"
FROM daily_sales
WHERE daily_date >= DATE '2012-12-01'
  AND daily_date < DATE '2013-01-01'
ORDER BY daily_date;

DAILY_DATE          SALES_AMT Running Avg
------------------ ---------- -----------
01-DEC-12               14530       14530
02-DEC-12               14230       14380
03-DEC-12               15830       14863
...
22-DEC-12               19235       17610
23-DEC-12               20133       17720
24-DEC-12               22839       17933
25-DEC-12               21833       18089
26-DEC-12               20593       18185
27-DEC-12               20001       18252
28-DEC-12               19848       18309
29-DEC-12               23134       18476
30-DEC-12               22783       18619
31-DEC-12               24323       18803

These results are excellent news! They show a clear and sharp movement upwards in the running average toward the end of the month. Querying for November sales, you see the very same pattern in that month. There's every reason to believe you can repeat the pattern for January. You confidently report to your regional manager that the power loss is a minor blip and nothing to be concerned about. Your daily mean sales remain above the mark; you are on track to end the month ahead of plan.
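The running average produced by AVG(...) OVER (ORDER BY daily_date) is simply the cumulative sum divided by the number of days so far. As a quick check of the first three December rows above (shown in Python, outside the database, since the point is the arithmetic):

```python
from itertools import accumulate

# First three December sales figures from the query output above.
sales = [14530, 14230, 15830]

# Running mean: cumulative total so far / number of days so far.
running_avg = [round(total / day)
               for day, total in enumerate(accumulate(sales), start=1)]

print(running_avg)  # matches the first three "Running Avg" values
```

The same division happens on every row of the window-function query, which is why the final row's running average equals the whole-month mean.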
Note: Means can be useful in communicating the big picture swiftly. Recite a list of daily sales numbers and a busy manager's eyes will glaze over. Give him a single number instead, such as the mean daily sales so far in the month, and you've conveyed the big picture without burying the manager in the details.
SELECT AVG(sales_amt) AS "Normal",
       AVG(COALESCE(sales_amt, 0)) AS "Coerced"
FROM null_demo;

    Normal    Coerced
---------- ----------
         3        1.2

And here is the underlying data:
SET NULL "NULL"
SELECT * FROM null_demo;

DAILY_DATE          SALES_AMT
------------------ ----------
01-FEB-13                   2
02-FEB-13                   4
03-FEB-13                NULL
04-FEB-13                NULL
05-FEB-13                NULL

The normal invocation of AVG takes only the non-null values 2 and 4 into account, returning the mean as (2 + 4) / 2 = 3. The second invocation treats nulls as if they were zeros, averaging over all five rows: (2 + 4 + 0 + 0 + 0) / 5 = 1.2. The rather odd effect is to return a mean that does not fall between the non-null values in the table. That mean has no meaning, either. What possible business meaning can the value 1.2 convey in this scenario? None at all. Think about the following two statements:
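The two behaviors are easy to reproduce outside SQL. This Python sketch treats None as NULL and mirrors AVG(sales_amt) versus AVG(COALESCE(sales_amt, 0)):

```python
# SALES_AMT column from null_demo, with NULL represented as None.
vals = [2, 4, None, None, None]

# AVG(sales_amt): nulls are excluded from both the sum and the count.
non_null = [v for v in vals if v is not None]
normal = sum(non_null) / len(non_null)

# AVG(COALESCE(sales_amt, 0)): nulls coerced to zero, all rows counted.
coerced = sum(v if v is not None else 0 for v in vals) / len(vals)

print(normal, coerced)  # → 3.0 1.2
```

The divisor is the difference: AVG divides by the count of non-null values, while the COALESCE version divides by the full row count.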
SELECT ROUND(AVG(price), 2) AS "Avg Per Gal"
FROM fleet_fuel
WHERE buy_date >= DATE '2013-01-01'
  AND buy_date < DATE '2013-02-01';

Avg Per Gal
-----------
       3.46

This result looks great. Doesn't it? You've applied the AVG function. You've gotten a result. You're well under the mark. You feel your bonus is secure. Except you're wrong.
SELECT ROUND(SUM(gallons * price) / SUM(gallons), 2) AS "Avg Per Gal"
FROM fleet_fuel
WHERE buy_date >= DATE '2013-01-01'
  AND buy_date < DATE '2013-02-01';

Avg Per Gal
-----------
       3.52

This result's more worrisome, isn't it? Your target is $3.50 per gallon. Your current, weighted mean is $3.52 per gallon. That bonus isn't so secure after all. Fortunately, you have time left in the month to act. Good thing you knew about weighted means. You may be able to salvage that bonus yet.
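To see why AVG(price) and SUM(gallons * price) / SUM(gallons) disagree, consider a tiny, hypothetical pair of purchases (these numbers are illustrative only, not the article's fleet_fuel data):

```python
# Hypothetical purchases: (gallons, price per gallon).
# A large cheap fill-up and a small expensive one.
purchases = [(500, 3.40), (100, 3.76)]

# AVG(price): every purchase counts equally, regardless of size.
simple = sum(p for _, p in purchases) / len(purchases)

# SUM(gallons * price) / SUM(gallons): each price weighted by volume.
weighted = sum(g * p for g, p in purchases) / sum(g for g, _ in purchases)

print(round(simple, 2), round(weighted, 2))  # → 3.58 3.46
```

The simple mean lets the small, pricey purchase pull the figure up; the weighted mean reflects what was actually spent per gallon pumped. The article's fleet data shows the same effect in the other direction.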
"The mean fuel price in the region is $3.55 per gallon."

However, an even more meaningful statement is:
"Half the gas stations charge less than $3.50 per gallon, and half charge more."

Here you are giving what is known as the median: $3.50 per gallon in this case. This is useful information, and I plan a follow-up article soon on how to compute the median and apply it. Your friend can reasonably decide to treat $3.50 as the maximum he'll pay per gallon. Knowing that half the fuel stations charge less, and presuming a reasonably even geographic distribution, your friend should be able to make the trip and hold out for the less-expensive gas stations.
"Half the gas stations charge less than $3.50 per gallon, and half charge more."
"The two most common prices are $3.46 per gallon, and $3.52 per gallon."

With this information, your friend might decide to hold out for $3.46 per gallon or less. Again, that's presuming a not-too-uneven geographic distribution. (There is only so much you can convey, after all, with just three numbers.)
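Both statistics are easy to illustrate with a small, made-up price list (chosen here so the median is $3.50 and the two most common prices are $3.46 and $3.52, matching the statements above; the data is hypothetical, and Python's statistics module stands in for the SQL):

```python
import statistics

# Hypothetical per-station prices in the region.
prices = [3.41, 3.46, 3.46, 3.48, 3.50, 3.52, 3.52, 3.55, 3.79]

# Median: half the stations charge less, half charge more.
print(statistics.median(prices))     # → 3.5

# Mode(s): the most frequently occurring price(s).
print(statistics.multimode(prices))  # → [3.46, 3.52]
```

Note that multimode (Python 3.8+) returns every value tied for most frequent, which is exactly the "two most common prices" situation described above.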
Note: Median is in many cases more useful or desirable than the arithmetic mean. Yet SQL has a history of poor support for computing the median. (Oracle Database does currently support a MEDIAN function.) You'll find plenty of convoluted techniques in older books on SQL for computing the median, some that work well and others that work mostly, except for some corner cases. It's always been trivial to compute the mean, though, and sometimes the mean has been chosen only for that reason. It's the "when all you have is a hammer" effect.
14530 + 14230 + 15830 + ... + 23134 + 22783 + 24323

Two other common patterns you'll encounter involve rates and products. It's useful to know how to spot them so that you don't apply AVG when it shouldn't be applied.
60 km at 60 kph + 60 km at 40 kph

This is not the simple, additive pattern to which the arithmetic mean applies. The correct mean speed is not 50 kph, like you might expect. It's actually 48 kph, because you spend more time (more hours) driving the slower leg: 120 km in 2.5 hours total. Another type of mean, the harmonic mean, will give the correct answer and should generally be applied to situations like this involving rates. The math behind the harmonic mean, and especially why that math works, is really interesting, and I plan a future article on the topic.
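The arithmetic is easy to check: the first leg takes 1.0 hour and the second takes 1.5 hours, so the true mean speed is total distance over total time. In Python (shown outside SQL, since the point is the math, and using the standard library's harmonic_mean):

```python
import statistics

# 60 km at 60 kph takes 1.0 h; 60 km at 40 kph takes 1.5 h.
speed = 120 / (60 / 60 + 60 / 40)  # total distance / total time

print(speed)                               # → 48.0
print(statistics.harmonic_mean([60, 40]))  # same answer via the harmonic mean
```

The harmonic mean agrees with the distance-over-time computation precisely because it averages the reciprocals (hours per km) and then inverts.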
N * 1.10 * 1.20 * 1.30

Once again, this is not an additive sequence. Rather, it's a series of products. The mean pay raise is not 20%. The geometric mean applies here, and the mean increase in pay works out to 19.72% per year. You can prove that by working through some math:
N * 1.10 * 1.20 * 1.30 = N * 1.1972 * 1.1972 * 1.1972
1.10 * 1.20 * 1.30 = 1.7160,
and 1.1972 * 1.1972 * 1.1972 = 1.7159,
and finally...

N * 1.7160 ≈ N * 1.7159

Rounding accounts for the difference of 0.0001 in the result. Geometric mean is also a topic I plan for a future article.
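The same check can be done directly with Python's geometric_mean (standard library, Python 3.8+), shown here because the point is the arithmetic rather than any SQL:

```python
import statistics

# Three consecutive raises as multiplicative factors.
factors = [1.10, 1.20, 1.30]

gm = statistics.geometric_mean(factors)
print(round(gm, 4))  # → 1.1972, i.e. a mean raise of 19.72% per year

# Defining property: three equal raises of gm reproduce the same total growth.
print(round(gm ** 3, 4), round(1.10 * 1.20 * 1.30, 4))  # → 1.716 1.716
```

The geometric mean is the single factor that, applied the same number of times, yields the same overall product; that is exactly the equality worked out by hand above.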
"What if you had two different ranges? How do you generate individual rows for each day between 01/01 to 01/15, and between 03/15 to 04/02?"

beg_date       end_date
01-Jan-2013    15-Jan-2013
15-Mar-2013    02-Apr-2013
SELECT LEVEL, LAST_DAY(SYSDATE)-LEVEL+1
FROM dual
WHERE ROWNUM <= EXTRACT(DAY FROM LAST_DAY(SYSDATE))
CONNECT BY LEVEL=ROWNUM;
SELECT LEVEL, LAST_DAY(SYSDATE)-LEVEL+1
FROM dual
WHERE LEVEL <= EXTRACT(DAY FROM LAST_DAY(SYSDATE))
CONNECT BY LEVEL=ROWNUM;
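The row-generator queries above count LEVEL from 1 up to the number of days in the current month and compute each date as LAST_DAY(SYSDATE) - LEVEL + 1. Here is that same arithmetic sketched in Python, with a fixed date standing in for SYSDATE so the example is reproducible:

```python
import calendar
from datetime import date, timedelta

# Fixed stand-in for SYSDATE, so the output is deterministic.
today = date(2013, 1, 10)

# LAST_DAY(SYSDATE): the final day of the current month.
last = date(today.year, today.month,
            calendar.monthrange(today.year, today.month)[1])

# One row per LEVEL value; each date is LAST_DAY(SYSDATE) - LEVEL + 1.
rows = [(level, last - timedelta(days=level - 1))
        for level in range(1, last.day + 1)]

print(rows[0])   # level 1 pairs with the last day of the month
print(rows[-1])  # the highest level pairs with the first day
```

As in the SQL, the dates come out in descending order: level 1 is the month's last day, and the highest level is the 1st.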
VARIABLE beg1_date CHAR(11);
VARIABLE end1_date CHAR(11);
VARIABLE beg2_date CHAR(11);
VARIABLE end2_date CHAR(11);
EXECUTE :beg1_date := '1-Jan-2013';
EXECUTE :end1_date := '15-Jan-2013';
EXECUTE :beg2_date := '15-Mar-2013';
EXECUTE :end2_date := '2-Apr-2013';
From: SELECT LEVEL, LAST_DAY(SYSDATE)-LEVEL+1
To: SELECT LEVEL, TO_DATE(:end2_date)-LEVEL+1
SELECT LEVEL, TO_DATE(:end2_date)-LEVEL+1
FROM dual
WHERE LEVEL <= 10
CONNECT BY LEVEL=ROWNUM;
LEVEL TO_DATE(:END2_DATE
---------- ------------------
1 02-APR-13
2 01-APR-13
3 31-MAR-13
4 30-MAR-13
5 29-MAR-13
6 28-MAR-13
7 27-MAR-13
8 26-MAR-13
9 25-MAR-13
10 24-MAR-13
SELECT LEVEL, TO_DATE(:end2_date)-LEVEL+1
FROM dual
WHERE TO_DATE(:beg1_date) <= TO_DATE(:end2_date)-LEVEL+1
CONNECT BY LEVEL=ROWNUM;
LEVEL TO_DATE(:END2_DATE
---------- ------------------
1 02-APR-13
2 01-APR-13
3 31-MAR-13
...
90 03-JAN-13
91 02-JAN-13
92 01-JAN-13
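The countdown logic here starts at :end2_date and steps back one day per row until reaching :beg1_date. A quick check outside the database (in Python, since the point is the date arithmetic) confirms that the span 01-Jan-2013 through 02-Apr-2013 contains exactly the 92 days shown above:

```python
from datetime import date, timedelta

beg1 = date(2013, 1, 1)   # :beg1_date
end2 = date(2013, 4, 2)   # :end2_date

# One row per day, counting down from end2 to beg1 inclusive,
# mirroring TO_DATE(:end2_date) - LEVEL + 1 in the query.
days = [end2 - timedelta(days=i) for i in range((end2 - beg1).days + 1)]

print(len(days))           # → 92
print(days[0], days[-1])   # first and last generated dates
```

That count is 31 January days + 28 February days + 31 March days + 2 April days, matching rows 1 through 92 of the output.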
SELECT LEVEL, TO_DATE(:end2_date)-LEVEL+1
FROM dual
WHERE TO_DATE(:beg1_date) <= TO_DATE(:end2_date)-LEVEL+1
AND NOT (
TO_DATE(:end2_date)-LEVEL+1 > TO_DATE(:end1_date)
AND TO_DATE(:end2_date)-LEVEL+1 < TO_DATE(:beg2_date)
)
CONNECT BY LEVEL=ROWNUM;
LEVEL TO_DATE(:END2_DATE
---------- ------------------
1 02-APR-13
2 01-APR-13
3 31-MAR-13
...
17 17-MAR-13
18 16-MAR-13
19 15-MAR-13
SELECT LEVEL, TO_DATE(:end2_date)-LEVEL+1
FROM dual
WHERE TO_DATE(:beg1_date) <= TO_DATE(:end2_date)-LEVEL+1
CONNECT BY LEVEL=ROWNUM
GROUP BY LEVEL, TO_DATE(:end2_date)-LEVEL+1
HAVING NOT (
TO_DATE(:end2_date)-LEVEL+1 > TO_DATE(:end1_date)
AND TO_DATE(:end2_date)-LEVEL+1 < TO_DATE(:beg2_date)
)
ORDER BY LEVEL;
LEVEL TO_DATE(:END2_DATE
---------- ------------------
1 02-APR-13
2 01-APR-13
3 31-MAR-13
...
17 17-MAR-13
18 16-MAR-13
19 15-MAR-13
78 15-JAN-13
79 14-JAN-13
80 13-JAN-13
...
90 03-JAN-13
91 02-JAN-13
92 01-JAN-13
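Because the exclusion now sits in HAVING, it is applied after all 92 rows have been generated, and the full 34 days survive (rows 1-19 and 78-92 above). The filter's logic, sketched in Python as a plain list comprehension over the generated dates:

```python
from datetime import date, timedelta

beg1, end1 = date(2013, 1, 1), date(2013, 1, 15)   # first range
beg2, end2 = date(2013, 3, 15), date(2013, 4, 2)   # second range

# Generate every day from end2 down to beg1, then drop the days strictly
# between the two ranges, mirroring NOT (d > end1 AND d < beg2).
all_days = [end2 - timedelta(days=i) for i in range((end2 - beg1).days + 1)]
kept = [d for d in all_days if not (end1 < d < beg2)]

print(len(kept))  # → 34: 19 days in March/April plus 15 days in January
```

The key point the SQL demonstrates is not the filter itself but where it runs: filtering after row generation (HAVING, or a subquery) keeps both ranges, whereas the earlier WHERE-clause attempt cut generation short.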
SELECT day_of_range FROM (
SELECT LEVEL AS range_counter,
TO_DATE(:end2_date)-LEVEL+1 AS day_of_range
FROM dual
WHERE TO_DATE(:beg1_date) <= TO_DATE(:end2_date)-LEVEL+1
CONNECT BY LEVEL=ROWNUM
) WHERE NOT (
day_of_range > TO_DATE(:end1_date)
AND day_of_range < TO_DATE(:beg2_date)
);
SELECT LEVEL, TO_DATE(:end2_date)-LEVEL+1
FROM dual
WHERE TO_DATE(:beg2_date) <= TO_DATE(:end2_date)-LEVEL+1
CONNECT BY LEVEL=ROWNUM
UNION ALL
SELECT LEVEL, TO_DATE(:end1_date)-LEVEL+1
FROM dual
WHERE TO_DATE(:beg1_date) <= TO_DATE(:end1_date)-LEVEL+1
CONNECT BY LEVEL=ROWNUM;
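This UNION ALL version runs one generator per range and glues the results together, with no exclusion filter needed. The same shape in Python, as a final sanity check of the row counts:

```python
from datetime import date, timedelta

def day_range_desc(beg, end):
    """One row per day, counting down from end to beg inclusive."""
    return [end - timedelta(days=i) for i in range((end - beg).days + 1)]

# Two independent generators concatenated, as with UNION ALL.
rows = (day_range_desc(date(2013, 3, 15), date(2013, 4, 2))    # 19 days
        + day_range_desc(date(2013, 1, 1), date(2013, 1, 15))) # 15 days

print(len(rows))  # → 34
```

Generating each range separately is arguably the clearest of the approaches shown: each branch states its own bounds, and adding a third range is just one more UNION ALL branch.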