Friday, June 14, 2013

What is UGC? All the information on how to become a professor

University Grants Commission (India)

From Wikipedia, the free encyclopedia
University Grants Commission
Abbreviation: UGC
Motto: Gyan-Vigyan Vimukte (Knowledge Liberates)
Formation: December 28, 1953
Headquarters: New Delhi
Location: India
Chairman: Prof. Ved Prakash
Affiliations: Department of Higher Education, Ministry of Human Resource Development
Website: www.ugc.ac.in
The University Grants Commission (UGC) of India is a statutory organisation set up by the Union government in 1956, charged with the coordination, determination and maintenance of standards of university education. It provides recognition to universities in India and disburses funds to such recognized universities and colleges. Prof. Ved Prakash, a noted academician and education administrator, is the incumbent Chairman of the UGC.[1] Its headquarters are in New Delhi, and it has six regional centres, in Pune, Bhopal, Kolkata, Hyderabad, Guwahati and Bangalore.[2]

History

The UGC was recommended in 1945 and formed in 1946 to oversee the work of the three Central Universities of Aligarh, Banaras and Delhi. In 1947, the Committee was entrusted with the responsibility of dealing with all the then-existing universities. After independence, the University Education Commission was set up in 1948 under the chairmanship of S. Radhakrishnan, and it recommended that the UGC be reconstituted on the general model of the University Grants Commission of the United Kingdom.
The UGC was formally inaugurated by Abul Kalam Azad, the Minister of Education, Natural Resources and Scientific Research, on 28 December 1953.
It was, however, formally established only in November 1956, by an Act of Parliament, as a statutory body of the Government of India. In order to ensure effective region-wise coverage throughout the country, the UGC has decentralised its operations by setting up six regional centres, at Pune, Hyderabad, Kolkata, Bhopal, Guwahati and Bangalore. The head office of the UGC is located at Bahadur Shah Zafar Marg in New Delhi, with two additional bureaus operating from 35 Feroze Shah Road and the South Campus of the University of Delhi.

UGC's Mandate

The UGC has the unique distinction of being the only grant-giving agency in the country which has been vested with two responsibilities: that of providing funds and that of coordination, determination and maintenance of standards in institutions of higher education.
The UGC's mandate includes:
  • Promoting and coordinating university education.
  • Determining and maintaining standards of teaching, examination and research in universities.
  • Framing regulations on minimum standards of education.
  • Monitoring developments in the field of collegiate and university education.
  • Disbursing grants to universities and colleges.
  • Serving as a vital link between the Union and state governments and institutions of higher learning.
  • Advising the Central and State governments on the measures necessary for improvement of university education.

GATE 2014 format

About GATE 2014:
The Indian Institute of Science (IISc) and seven Indian Institutes of Technology (the IITs at Bombay, Delhi, Guwahati, Kanpur, Kharagpur, Madras and Roorkee) jointly administer the conduct of GATE (Graduate Aptitude Test in Engineering), which serves as a means for admission to graduate programmes in various engineering and science disciplines in the country. The results of GATE are also used by several Public Sector Undertakings to shortlist candidates for recruitment. The operations related to GATE in each of the eight zones are managed by a zonal GATE office at an IIT or at IISc. The Organizing Institute (OI) is responsible for the end-to-end process and for coordination amongst the administering institutes. The Organizing Institute for GATE 2014 is IIT Kharagpur. For GATE 2014, the examinations for all papers will be conducted in computer-based online mode, in multiple sessions spread over the period from 1st February 2014 to 9th March 2014. In the computer-based online examination, candidates will be required to answer questions that appear on a computer connected to a server over a Local Area Network (LAN). Answers will be recorded on a server that also keeps track of the time for the examination. We hereby seek a Company Partner (CP) to provide the technology and the operational solution for the conduct of the online examination for GATE 2014.
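As a toy illustration of the flow just described, here is a minimal sketch, assuming nothing about GATE's actual software: a central server records each answer and enforces the examination time, rather than trusting the candidate's machine. The class and identifiers are invented for the example.

import time

class ExamServer:
    """Toy stand-in for the exam server described above."""
    def __init__(self, duration_seconds):
        # The server, not the candidate's computer, tracks the exam clock.
        self.deadline = time.time() + duration_seconds
        self.answers = {}  # (candidate_id, question_id) -> answer

    def record_answer(self, candidate_id, question_id, answer):
        if time.time() > self.deadline:
            return False  # session over; the answer is rejected
        self.answers[(candidate_id, question_id)] = answer
        return True

server = ExamServer(duration_seconds=3 * 60 * 60)  # one three-hour session
print(server.record_answer("GA1234", "Q1", "B"))   # True while time remains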

UGC NET exam

National Eligibility Test

From Wikipedia, the free encyclopedia
The National Eligibility Test (NET) is the national-level entrance examination in India for postgraduate candidates who wish to qualify for admission to PhD research and/or for university-level teaching jobs in India. For the humanities and arts disciplines it is administered by the University Grants Commission (referred to as "UGC NET"), while for the science and engineering disciplines it is jointly conducted by the Council of Scientific and Industrial Research (CSIR) and the University Grants Commission ("CSIR-UGC NET").


Controversies

While a section of academicians has advocated the abolition of the National Eligibility Test, the All India Federation of University and College Teachers' Organisation (AIFUCTO) has strongly opposed attempts to abolish it. The organisation argued that the test improves the standard of teaching in the country, and that instead of abolishing it altogether the government should revisit the provisions that make it flawed and hard to implement.[1]
Image depicting the eligibility criteria for the June 2012 National Eligibility Test (NET); criteria which the UGC itself set aside in releasing the largely non-criteria-based supplementary list.
The National Eligibility Test (NET) of June 2012 proved the most controversial edition of the examination. Its results were published on September 18, 2012. After the publication of the results, the Commission allegedly altered the Test's qualification norms, mandating that candidates in the general category score an aggregate of 65 per cent across all three of the NET's papers to become eligible for lectureship; the corresponding figure for the OBC category was 60 per cent, and for the SC/ST category 55 per cent. According to the appearing candidates, the original UGC notification had specified only that candidates in the general category score at least 40 per cent in papers one and two and 50 per cent in paper three to be eligible for consideration in the final preparation of the result.[2]
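To see why candidates protested, it helps to put the two rules side by side. Below is a minimal sketch, not an official UGC formula; the paper maximums of 100, 100 and 200 marks are assumptions made for the example. A candidate can clear the originally notified per-paper minima and still fail the later aggregate cutoff.

# Illustrative comparison of the two June 2012 UGC NET rules
# (general category). Assumed paper maximums: 100, 100 and 200 marks.

PAPER_MAX = (100, 100, 200)

def passes_original_norms(marks):
    # Originally notified rule: at least 40% in papers one and two,
    # and at least 50% in paper three.
    p1, p2, p3 = (m / mx * 100 for m, mx in zip(marks, PAPER_MAX))
    return p1 >= 40 and p2 >= 40 and p3 >= 50

def passes_revised_norms(marks, aggregate_cutoff=65):
    # Rule announced after the results: a 65% aggregate for general
    # candidates (60% for OBC, 55% for SC/ST) across all three papers.
    aggregate = sum(marks) / sum(PAPER_MAX) * 100
    return aggregate >= aggregate_cutoff

marks = (45, 45, 110)                # 45%, 45% and 55%; aggregate is 50%
print(passes_original_norms(marks))  # True  - clears the 40/40/50 minima
print(passes_revised_norms(marks))   # False - 50% aggregate is below 65%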
This image highlights the anomalies of the June 2012 UGC NET result, where candidates scoring less than 50% passed the test. The UGC had set aggregate pass criteria of 65% for General, 60% for OBC and 55% for SC/ST candidates.
In the light of the student protests and representations, the UGC released a supplementary list on November 12, 2012, which qualified a few more candidates but did not specify any criteria for the revised list.[3] Added to this were anomalies in the declared results: candidates securing less than 50 per cent aggregate were declared qualified, while many general-category candidates with more than 60 per cent remained unqualified.
More than 7,000 candidates approached the Kerala High Court against the University Grants Commission (UGC). The Kerala High Court declared illegal the new norms fixed by the UGC for the National Eligibility Test (NET) for college and university lectureship. The court held that fixing higher aggregate marks for the three categories (General, OBC and SC/ST), and that too just before the announcement of results, could not be justified, as it was "not supportable by law".[4]
In the light of this judgement, the University Grants Commission added a specific note, "NOTIFICATION REGARDING PROCEDURE AND CRITERIA FOR DECLARATION OF RESULT OF UGC NET TO BE HELD ON 30TH DECEMBER, 2012", on its website just two days before the examination. It also listed stepwise clearance criteria for candidates of different categories and subjects, according to competitive cutoffs fixed by the Commission, with the aim of qualifying only the top 15 per cent of candidates.[5]
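The "top 15 per cent" clause amounts to a percentile cutoff applied within each category-and-subject group. Here is a minimal sketch of that idea with made-up scores; the UGC's actual stepwise procedure has more stages than this.

# Sketch of a "qualify only the top 15 per cent" rule applied within
# one category-and-subject group. The scores are hypothetical.

def qualifying_scores(scores, fraction=0.15):
    n_qualify = max(1, int(len(scores) * fraction))  # round down, min 1
    cutoff = sorted(scores, reverse=True)[n_qualify - 1]
    return [s for s in scores if s >= cutoff]  # ties at the cutoff qualify

group = [48, 52, 55, 57, 60, 61, 63, 64, 66, 71]  # ten candidates
print(qualifying_scores(group))  # [71] - 15% of 10 rounds down to 1 qualifier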


TOP500 most powerful computers


Exponential growth of supercomputer performance, based on data from the top500.org site. The y-axis shows performance in GFLOPS. The red line denotes the fastest supercomputer in the world at the time; the yellow line denotes supercomputer no. 500 on the TOP500 list; the dark blue line denotes the total combined performance of the supercomputers on the TOP500 list.
The TOP500 project ranks and details the 500 most powerful (non-distributed) computer systems in the world. The project was started in 1993 and publishes an updated list of the supercomputers twice a year. The first of these updates always coincides with the International Supercomputing Conference in June, and the second is presented in November at the ACM/IEEE Supercomputing Conference. The project aims to provide a reliable basis for tracking and detecting trends in high-performance computing, and bases its rankings on HPL,[1] a portable implementation of the High-Performance LINPACK benchmark, written in Fortran, for distributed-memory computers.
The TOP500 list is compiled by Hans Meuer of the University of Mannheim, Germany; Jack Dongarra of the University of Tennessee, Knoxville; and Erich Strohmaier and Horst Simon of NERSC/Lawrence Berkeley National Laboratory.


History

In the early 1990s, a new definition of supercomputer was needed to produce meaningful statistics. After experimenting with metrics based on processor count in 1992, the idea was born at the University of Mannheim to use a detailed listing of installed systems as the basis. In early 1993, Jack Dongarra was persuaded to join the project with his LINPACK benchmark. A first test version was produced in May 1993, partially based on data available on the Internet from several sources.[2][3]
The information from those sources was used for the first two lists. Since June 1993, the TOP500 has been produced twice a year based on site and vendor submissions only.
Since 1993, the performance of the #1-ranked system has grown steadily, in agreement with Moore's law, doubling roughly every 14 months. As of November 2012, the fastest system, Titan, with an Rpeak[4] of 27.1125 PFlop/s, is over 206,965 times faster than the fastest system in November 1993, the Connection Machine CM-5/1024 (1,024 cores), with an Rpeak of 131.0 GFlop/s.[5]
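Both quoted figures can be verified with a few lines of arithmetic; the sketch below simply re-derives them from the Rpeak values in the paragraph above.

import math

# Rpeak values quoted above: Titan (Nov 2012) vs. CM-5/1024 (Nov 1993).
titan_rpeak = 27.1125e15  # 27.1125 PFlop/s
cm5_rpeak = 131.0e9       # 131.0 GFlop/s

speedup = titan_rpeak / cm5_rpeak
print(int(speedup))       # 206965, matching the quoted factor

# Implied doubling time over the 19 years (228 months) between the lists.
doubling_months = (19 * 12) / math.log2(speedup)
print(round(doubling_months, 1))  # about 12.9 months, consistent with the
                                  # "roughly every 14 months" trend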

Top 10 ranking

The following table gives the Top 10 positions of the 40th TOP500 List released on November 12, 2012.
Rank | Rmax (Pflops) | Rpeak (Pflops) | Name | Computer design | Processor type, interconnect | Vendor | Site | Country, year | Operating system
1 | 17.590 | 27.113 | Titan | Cray XK7 | Opteron 6274 + Tesla K20X, Custom | Cray | Oak Ridge National Laboratory (ORNL), Tennessee | United States, 2012 | Linux (CLE, SLES based)
2 | 16.325 | 20.133 | Sequoia | Blue Gene/Q | PowerPC A2, Custom | IBM | Lawrence Livermore National Laboratory | United States, 2011 | Linux (RHEL and CNK)
3 | 10.510 | 11.280 | K computer | RIKEN | SPARC64 VIIIfx, Tofu | Fujitsu | RIKEN | Japan, 2011 | Linux
4 | 8.162 | 10.066 | Mira | Blue Gene/Q | PowerPC A2, Custom | IBM | Argonne National Laboratory | United States, 2012 | Linux (RHEL and CNK)
5 | 4.141 | 5.033 | JUQUEEN | Blue Gene/Q | PowerPC A2, Custom | IBM | Forschungszentrum Jülich | Germany, 2012 | Linux (RHEL and CNK)
6 | 2.897 | 3.185 | SuperMUC | iDataPlex DX360M4 | Xeon E5-2680, Infiniband | IBM | Leibniz-Rechenzentrum | Germany, 2012 | Linux
7 | 2.660 | 3.959 | Stampede | PowerEdge C8220 | Xeon E5-2680, Infiniband | Dell | Texas Advanced Computing Center | United States, 2012 | Linux
8 | 2.566 | 4.701 | Tianhe-1A | NUDT YH Cluster | Xeon 5670 + Tesla 2050, Arch[6] | NUDT | National Supercomputing Center of Tianjin | China, 2010 | Linux
9 | 1.725 | 2.097 | Fermi | Blue Gene/Q | PowerPC A2, Custom | IBM | CINECA | Italy, 2012 | Linux (RHEL and CNK)
10 | 1.515 | 1.944 | DARPA Trial Subset | Power 775 | POWER7, Custom | IBM | IBM Development Engineering | United States, 2012 | Linux (RHEL)
Legend
  • Rank – Position within the TOP500 ranking. In the TOP500 List table, the computers are ordered first by their Rmax value. In the case of equal performances (Rmax value) for different computers, the order is by Rpeak. For sites that have the same computer, the order is by memory size and then alphabetically.
  • Rmax – The highest score measured using the LINPACK benchmark suite. This is the number used to rank the computers. Measured in quadrillions of floating-point operations per second, i.e. petaflops (a short sketch after this legend illustrates the ordering and these units).
  • Rpeak – This is the theoretical peak performance of the system. Measured in Pflops.
  • Name – Some supercomputers are unique, at least in their location, and are therefore given names by their owners.
  • Computer – The computing platform as it is marketed.
  • Processor type, interconnect – The processor architecture of the cores used to run LINPACK and, where it is of interest, the interconnect between computing nodes.
  • Vendor – The manufacturer of the platform and hardware.
  • Site – The name of the facility operating the supercomputer.
  • Country – The country in which the computer is situated.
  • Year – The year of installation/last major update.
  • Operating System – The operating system that the computer uses.
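To make the ordering rule and the Rmax/Rpeak fields concrete, here is a small sketch that sorts a hand-copied sample of the table above by the stated rule (descending Rmax, with Rpeak as the tie-break) and computes each system's HPL efficiency, Rmax divided by Rpeak. The list literal is just sample data for illustration.

# Ranking a sample of the table by the legend's rule and computing
# HPL efficiency (Rmax / Rpeak). Values are in Pflops.

systems = [
    ("Sequoia",   16.325, 20.133),
    ("Titan",     17.590, 27.113),
    ("Tianhe-1A",  2.566,  4.701),
    ("Stampede",   2.660,  3.959),
]

ranked = sorted(systems, key=lambda s: (s[1], s[2]), reverse=True)

for rank, (name, rmax, rpeak) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: Rmax {rmax} Pflops, "
          f"efficiency {rmax / rpeak:.0%}")

# Titan sustains about 65% of its peak on HPL and Sequoia about 81%;
# accelerator-based systems such as Tianhe-1A (about 55%) tend to show
# lower HPL efficiency than homogeneous designs.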

Number of systems

By number of systems as of June 2012:[7]
Charts accompanying the list break the systems down by top processor architectures, top vendors and top regions.
Each country's counts below are listed in the source's column order, from the November 2012 list back to the November 2007 list; countries that were absent from some lists have fewer than eleven entries.
  • United States: 250, 252, 263, 255, 276, 280, 277, 291, 291, 258, 284
  • China: 72, 68, 74, 61, 41, 25, 21, 21, 15, 12, 10
  • Japan: 32, 35, 30, 26, 26, 18, 16, 15, 17, 22, 20
  • United Kingdom: 24, 25, 27, 27, 24, 38, 44, 43, 45, 52, 47
  • France: 21, 22, 23, 25, 25, 29, 26, 23, 26, 34, 17
  • Germany: 19, 20, 20, 30, 26, 24, 27, 30, 25, 47, 31
  • Canada: 11, 10, 9, 8, 6, 7, 9, 8, 2, 2, 5
  • India: 8, 5, 2, 2, 4, 5, 3, 6, 8, 6, 9
  • Russia: 8, 5, 5, 12, 11, 11, 8, 4, 8, 8, 7
  • Australia: 7, 6, 4, 6, 4, 1, 1, 1, 1, 1, 1
  • Italy: 7, 8, 4, 5, 6, 7, 6, 6, 11, 6, 6
  • Sweden: 6, 4, 3, 5, 6, 8, 7, 10, 8, 9, 7
  • Korea, South: 4, 3, 3, 4, 3, 1, 2, 1, 1, 1
  • Poland: 4, 5, 6, 5, 6, 5, 3, 4, 6, 3, 1
  • Switzerland: 4, 1, 3, 4, 4, 5, 5, 4, 4, 6, 7
  • Finland: 3, 1, 1, 2, 1, 3, 2, 1, 1, 1, 5
  • Norway: 3, 3, 0, 1, 3, 2, 2, 2, 2, 2, 3
  • Saudi Arabia: 3, 3, 3, 4, 6, 4, 5, 3
  • Taiwan: 3, 3, 2, 2, 1, 2, 3, 1, 1
  • Brazil: 2, 3, 2, 2, 2, 1, 1, 2, 1, 1
  • Spain: 2, 4, 3, 2, 3, 3, 6, 4, 6, 7, 9
  • Austria: 1, 1, 2, 2, 1, 2, 8, 5
  • Belgium: 1, 2, 1, 2, 2, 1, 1, 2, 2, 1
  • Denmark: 1, 1, 2, 2, 2, 3, 3, 1
  • Israel: 1, 3, 3, 2, 1, 2, 1, 1
  • Mexico: 1, 1, 1
  • Slovak Republic: 1, 1
  • Ireland: 3, 3, 1, 1, 1, 1, 1, 1, 1
  • Singapore: 1, 1, 2, 2, 1, 1, 1, 1
  • South Africa: 1, 1, 1, 1, 1
  • United Arab Emirates: 1
  • Netherlands: 1, 2, 4, 3, 3, 3, 5, 6
  • Hong Kong: 1, 1, 1
  • New Zealand: 5, 7, 8, 6, 4, 6, 1
  • Slovenia: 1, 1, 1, 1, 1, 1
  • Turkey: 1, 1
  • Bulgaria: 1, 1, 1
  • Malaysia: 1, 1, 1, 2, 3
  • Cyprus: 1, 1
  • Egypt: 1, 1
  • Indonesia: 1
  • Luxembourg: 1

Large machines not on the list

Machines that have not been benchmarked are not eligible for the list, such as NCSA's Blue Waters. Purpose-built machines that are not capable of running the benchmark, or that do not run it, are likewise excluded, such as the RIKEN MDGRAPE-3.