Thursday, December 28, 2017

Teen on a mission to clean up the city

Inspired by the ‘Swachh Bharat’ mission of Prime Minister Narendra Modi, Sidharth Runwal, a 17-year-old student of B.D. Somani International School, is on a mission to improve the quality of life of ragpickers and garbage cleaners in Mumbai through his non-profit organisation, the Clean-Up Foundation.
The NGO aims to achieve the objective through various activities targeted at improving and enhancing the hygiene and sanitation factors around those who help in cleaning up the city but themselves live in appalling conditions.
Runwal said, “The first step towards improving the quality of life for the garbage cleaners of Mumbai is to provide them with clean drinking water. It is ironic that the people who clean the city every day live in such deplorable conditions.” He added that, as per the BMC report, the average garbage cleaner does not live beyond 40 years. “They do not have access to fundamental necessities such as potable water that are essential for survival,” he added.
“The Clean-Up Foundation installed water purifiers at multiple chowkis across Bandra, Khar and Santa Cruz under the H/West Ward Office. This initiative ensured clean drinking water is available for over 12,000 garbage cleaners in Mumbai,” said Runwal.



Source: DNA, 23rd December 2017

New system uses Twitter, AI to give early warnings of floods - Social media can be used to complement datasets, researchers find

London: Scientists are combining Twitter, citizen science and cutting-edge artificial intelligence (AI) techniques to develop an early-warning system for flood-prone communities.
Researchers from the University of Dundee in the UK have shown how AI can be used to extract data from Twitter and crowdsourced information from mobile phone apps to build up hyper-resolution monitoring of urban flooding.
Urban flooding is difficult to monitor due to complexities in data collection and processing. This prevents detailed risk analysis, flooding control, and the validation of numerical models.
Researchers set about trying to solve this problem by exploring how the latest AI technology can be used to mine social media and apps for the data that users provide.
They found that social media and crowdsourcing can be used to complement datasets based on traditional remote sensing and witness reports.
Applying these methods in case studies, the researchers found them to be genuinely informative, and concluded that AI can play a key role in future flood warning and monitoring systems.
“Sea levels have been rising at an average rate of 3.4mm a year over the past decade. The extremes of today will become the average of the future so coastal cities and countries must take action to protect their land,” Dr Roger Wang said.
“We were particularly interested in the increased incidence of what we call sunny day flooding that occurs in the absence of any extreme weather event due to the mean sea level being higher,” he said.
“A tweet can be very informative in terms of flooding data. Key words were our first filter, then we used natural language processing to find out more about severity, location and other information,” Wang said.
“Computer vision techniques were applied to the data collected from MyCoast, a crowdsourcing app, to automatically identify scenes of flooding from the images that users post,” he added.
“We found these big data-based flood monitoring approaches can definitely complement the existing means of data collection and demonstrate great promise for improving monitoring and warnings in future,” he said.
Twitter data was streamed over a one-month period in 2015, with the filtering keywords of ‘flood’, ‘inundation’, ‘dam’, ‘dike’, and ‘levee’. More than 7,500 tweets were analysed over this time.
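The keyword filtering step described above can be sketched in a few lines of Python. Only the five keywords come from the study; the helper function and sample tweets below are illustrative, not the researchers' actual pipeline or data:

```python
# Illustrative keyword filter for flood-related tweets.
# The keyword list is the one reported in the study; the tweets are made up.
FLOOD_KEYWORDS = {"flood", "inundation", "dam", "dike", "levee"}

def is_flood_related(tweet_text):
    """Return True if the tweet mentions any flood keyword."""
    words = tweet_text.lower().split()
    return any(word.strip(".,!?#") in FLOOD_KEYWORDS for word in words)

tweets = [
    "Major flood on Main Street, road impassable",
    "Lovely sunny day at the beach",
    "The levee near the harbour is overtopping #flood",
]
flood_tweets = [t for t in tweets if is_flood_related(t)]
print(len(flood_tweets))  # 2
```

In the study, tweets passing a filter like this were then handed to natural language processing to extract severity and location.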
MyCoast is a system used by a number of environmental agencies to collect ‘citizen science’ data about various coastal hazards or incidents.
The system contains over 6,000 flood photographs, all of which were collected through the mobile app.
The information extracted by AI tools was validated against precipitation data and road closure reports to examine the quality of the data.
Flood-related tweets were shown to correlate to precipitation levels, while the crowdsourced data matched strongly with the road closure reports.
The researchers believe a tool like Twitter is more useful for large-scale, inexpensive monitoring, while the crowdsourced data provides rich and customised information at the micro level. 

Source: DNA, 27th December 2017

TECHFEST-IIT BOMBAY beckons


Syria, China, Iran, Afghanistan, Russia and the US are among the 20-plus countries that will participate this year

Source: DNA, 27th December 2017

Wednesday, December 27, 2017

Poor ballast makes 10,000 km track dicey

TRICHY: Around 10,000 km of railway tracks across the country are sitting on a recipe for disaster, quite literally, as the ballast - the small, coarse stones that act as shock absorbers - has only a third of the mandated thickness.
A high-level meeting presided over by Ashwani Lohani, chairman of the Railway Board, with general managers of all 17 zones on November 27, discussed the issue of the thinning down of ballast in light of the derailment of the Vasco da Gama-Patna Express on November 24.
Thirteen coaches went off the rails in that accident, which left three people dead and nine injured.

"About 10,000 km track on IR has clean ballast cushion less than 100 mm. Deep screening of these stretches should be dealt on priority and liquidated within a year," the minutes of the meeting read, directing all general managers to take the action required. According to railway standards, layers of ballast must have a thickness of at least 300 mm.

Apart from poor maintenance and a staff crunch, a shortage of ballast-cleaning machines was cited as a major reason for the ballast deficiency.

Ballast acts as a cushion and absorbs the vibrations, ensuring the safety of trains on a railway track. The loads from the wheels of trains ultimately come on the ballast through rails and sleepers, with the thickness of the ballast sheet under the track decreasing over a period of time.

The thickness can be brought back by manual packing or mechanical tamping, one of the key maintenance works for the civil engineering department of the railways. As a convention, this standardisation is done once in 10 years on heavy traffic lines, said a top railway official.

Ballast deficiency is one of the root causes of weld failure, rail fracture, breaks in fish plates, loosening of bolts and nuts, and track alignment variation, said D Manoharan, deputy general secretary of the Dakshin Railway Employees Union, affiliated to the CPI(M). "It could lead to derailments or fatal accidents. When accidents occur, authorities focus only on the renewal of tracks and other related issues, but unfortunately, they don't look at the importance of maintaining ballast," he said.

When contacted, a senior official from the office of the chief track engineer in Southern Railway said that over a period of time, the ballast composition would get diluted with mud and vegetation.

"The thickness of the ballast varies based on the traffic. Ten thousand kilometres of track facing ballast deficiency shows the overall picture. When compared to over one lakh track kilometres across the country, this would hardly come to 10-12%," he said.

However, the deficiency should definitely be liquidated and the ballast should also be maintained as per the parameters, he added.

"These things will not directly contribute to accidents immediately. Ballast deficiency will directly affect the performance of train movement only in the long run," the official said.

Source: 

Adani group to foray into highway construction sector - Experts feel the timing of the company to enter the road construction sector is opportune as the infrastructure sector is on a growth path


The Adani group, which has a presence in the power, coal and maritime sectors, will soon foray into the highway construction sector.

According to sources in the know, the Gautam Adani-led group is expected to carve out a subsidiary that would undertake the execution of highway contracts on an EPC (engineering-procurement-construction) basis. It may also bid for upcoming national and state highway contracts. An emailed query sent to a company spokesperson for confirmation, however, went unanswered.

Another group company, Adani Capital Private, on Thursday announced a Rs 100-crore investment in BSCPL Infrastructure Ltd, a Hyderabad-based EPC and infrastructure development company focused primarily on the road sector. The funding in BSCPL would be repaid out of the proceeds of a BOT (build-operate-transfer) asset sale after completion of the project.

Experts feel the timing of the company to enter the road construction sector is opportune as the infrastructure sector is on a growth path. “If they are planning a foray, it is a great opportunity for them to bid for hybrid-annuity and EPC contracts and even look at picking up equity in some of the highway monetisation projects,” an industry expert who did not wish to be quoted said.

The Cabinet Committee on Economic Affairs on August 3, 2016, had authorised the National Highways Authority of India to monetise 111 publicly-funded projects and a list of 75 operational projects was prepared for potential monetisation using the BOT model. The proceeds from these projects would be utilised for development, and for operations and maintenance of the highways.

At present, the Ahmedabad-based Adani group has a presence in the power sector with an electricity generation capacity of 10,440 MW. According to information on the firm’s website, more than 75 per cent of the electricity generated at Adani Power’s plants is pre-sold under long-term arrangements.

The company is developing and operating mines in India, Indonesia and Australia, as well as importing coal from across the world. Its thermal coal production stood at 11.75 million tonnes in 2016.


The group, through its company Adani Ports and Special Economic Zone, is present in seven port locations (Dahej, Dhamra, Ennore, Hazira, Murmugao, Visakhapatnam and Tuna-off Tekra, near Kandla) in India, apart from Mundra. The company plans to increase the annual cargo-handling capacity of its ports to 200 million tonnes by 2020. The port is expected to achieve 100 million tonnes or more of cargo-handling capacity in the current financial year.

Source: 

Can computers help us synthesize new materials? - Machine-learning system finds patterns in materials “recipes,” even when training data is lacking


Last month, three MIT materials scientists and their colleagues published a paper describing a new artificial-intelligence system that can pore through scientific papers and extract “recipes” for producing particular types of materials.
That work was envisioned as the first step toward a system that can originate recipes for materials that have been described only theoretically. Now, in a paper in the journal npj Computational Materials, the same three materials scientists, with a colleague in MIT’s Department of Electrical Engineering and Computer Science (EECS), take a further step in that direction, with a new artificial-intelligence system that can recognize higher-level patterns that are consistent across recipes.
For instance, the new system was able to identify correlations between “precursor” chemicals used in materials recipes and the crystal structures of the resulting products. The same correlations, it turned out, had been documented in the literature.
The system also relies on statistical methods that provide a natural mechanism for generating original recipes. In the paper, the researchers use this mechanism to suggest alternative recipes for known materials, and the suggestions accord well with real recipes.
The first author on the new paper is Edward Kim, a graduate student in materials science and engineering. The senior author is his advisor, Elsa Olivetti, the Atlantic Richfield Assistant Professor of Energy Studies in the Department of Materials Science and Engineering (DMSE). They’re joined by Kevin Huang, a postdoc in DMSE, and by Stefanie Jegelka, the X-Window Consortium Career Development Assistant Professor in EECS.
Sparse and scarce
Like many of the best-performing artificial-intelligence systems of the past 10 years, the MIT researchers’ new system is a so-called neural network, which learns to perform computational tasks by analyzing huge sets of training data. Traditionally, attempts to use neural networks to generate materials recipes have run up against two problems, which the researchers describe as sparsity and scarcity.
Any recipe for a material can be represented as a vector, which is essentially a long string of numbers. Each number represents a feature of the recipe, such as the concentration of a particular chemical, the solvent in which it’s dissolved, or the temperature at which a reaction takes place.
Since any given recipe will use only a few of the many chemicals and solvents described in the literature, most of those numbers will be zero. That’s what the researchers mean by “sparse.”
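The sparsity problem can be made concrete with a toy encoding. The feature names below are invented for illustration; the paper's actual feature vocabulary is far larger:

```python
import numpy as np

# Hypothetical vocabulary of features a recipe vector could encode.
FEATURES = ["TiO2_conc", "MnO2_conc", "NaOH_conc", "HCl_conc",
            "water_solvent", "ethanol_solvent", "temp_C", "time_h",
            "LiOH_conc", "KOH_conc"]

def recipe_to_vector(recipe):
    """Encode a recipe (feature -> value dict) as a dense vector over
    the full feature vocabulary; unused features stay zero."""
    vec = np.zeros(len(FEATURES))
    for feature, value in recipe.items():
        vec[FEATURES.index(feature)] = value
    return vec

# A recipe uses only a handful of the possible features...
recipe = {"MnO2_conc": 0.5, "water_solvent": 1.0, "temp_C": 450.0}
vec = recipe_to_vector(recipe)
# ...so most entries of its vector are zero: that is the sparsity problem.
print(np.count_nonzero(vec), "of", vec.size, "entries are non-zero")  # 3 of 10
```

With thousands of chemicals and solvents in the real vocabulary, the fraction of non-zero entries becomes vanishingly small.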
Similarly, to learn how modifying reaction parameters — such as chemical concentrations and temperatures — can affect final products, a system would ideally be trained on a huge number of examples in which those parameters are varied. But for some materials — particularly newer ones — the literature may contain only a few recipes. That’s scarcity.
“People think that with machine learning, you need a lot of data, and if it’s sparse, you need more data,” Kim says. “When you’re trying to focus on a very specific system, where you’re forced to use high-dimensional data but you don’t have a lot of it, can you still use these neural machine-learning techniques?”
Neural networks are typically arranged into layers, each consisting of thousands of simple processing units, or nodes. Each node is connected to several nodes in the layers above and below. Data is fed into the bottom layer, which manipulates it and passes it to the next layer, which manipulates it and passes it to the next, and so on. During training, the connections between nodes are constantly readjusted until the output of the final layer consistently approximates the result of some computation.
The problem with sparse, high-dimensional data is that for any given training example, most nodes in the bottom layer receive no data. It would take a prohibitively large training set to ensure that the network as a whole sees enough data to learn to make reliable generalizations.
Artificial bottleneck
The purpose of the MIT researchers’ network is to distill input vectors into much smaller vectors, all of whose numbers are meaningful for every input. To that end, the network has a middle layer with just a few nodes in it — only two, in some experiments.
The goal of training is simply to configure the network so that its output is as close as possible to its input. If training is successful, then the handful of nodes in the middle layer must somehow represent most of the information contained in the input vector, but in a much more compressed form. Such systems, in which the output attempts to match the input, are called “autoencoders.”
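The bottleneck shape can be sketched minimally as follows. The weights here are random and untrained, and the dimensions are illustrative; the paper's actual architecture differs:

```python
import numpy as np

rng = np.random.default_rng(0)

INPUT_DIM, BOTTLENECK_DIM = 40, 2  # illustrative sizes

# Random (untrained) weights; a real autoencoder learns these by
# minimising the reconstruction error ||decode(encode(x)) - x||.
W_enc = rng.normal(size=(INPUT_DIM, BOTTLENECK_DIM))
W_dec = rng.normal(size=(BOTTLENECK_DIM, INPUT_DIM))

def encode(x):
    return np.tanh(x @ W_enc)   # compress the input to just 2 numbers

def decode(z):
    return z @ W_dec            # expand back to the full input size

x = rng.normal(size=INPUT_DIM)  # a (dense stand-in for a) recipe vector
z = encode(x)
x_hat = decode(z)
print(z.shape, x_hat.shape)     # (2,) (40,)
```

After successful training, `z` would carry most of the information in `x` despite being twenty times smaller.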
Autoencoding compensates for sparsity, but to handle scarcity, the researchers trained their network on not only recipes for producing particular materials, but also on recipes for producing very similar materials. They used three measures of similarity, one of which seeks to minimize the number of differences between materials — substituting, say, just one atom for another — while preserving crystal structure.
During training, the weight that the network gives example recipes varies according to their similarity scores.
Playing the odds
In fact, the researchers’ network is not just an autoencoder, but what’s called a variational autoencoder. That means that during training, the network is evaluated not only on how well its outputs match its inputs, but also on how well the values taken on by the middle layer accord with some statistical model — say, the familiar bell curve, or normal distribution. That is, across the whole training set, the values taken on by the middle layer should cluster around a central value and then taper off at a regular rate in all directions.
After training a variational autoencoder with a two-node middle layer on recipes for manganese dioxide and related compounds, the researchers constructed a two-dimensional map depicting the values that the two middle nodes took on for each example in the training set.
Remarkably, training examples that used the same precursor chemicals stuck to the same regions of the map, with sharp boundaries between regions. The same was true of training examples that yielded four of manganese dioxide’s common “polymorphs,” or crystal structures. And combining those two mappings indicated correlations between particular precursors and particular crystal structures.
“We thought it was cool that the regions were continuous,” Olivetti says, “because there’s no reason that that should necessarily be true.”
Variational autoencoding is also what enables the researchers’ system to generate new recipes. Because the values taken on by the middle layer adhere to a probability distribution, picking a value from that distribution at random is likely to yield a plausible recipe.
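Generation then amounts to sampling the latent space and decoding. A sketch of the mechanics, with a stand-in decoder whose weights are random (a trained VAE's decoder would map the sample to a plausible recipe vector):

```python
import numpy as np

rng = np.random.default_rng(1)

# Training pushes the 2-D latent codes toward a standard normal
# distribution, so a sample z ~ N(0, I) is likely to land in a region
# of latent space occupied by real recipes.
z = rng.standard_normal(2)

# Stand-in decoder with random fixed weights, purely for illustration.
W_dec = rng.normal(size=(2, 10))
candidate_recipe_vector = z @ W_dec
print(candidate_recipe_vector.shape)  # (10,)
```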
“This actually touches upon various topics that are currently of great interest in machine learning,” Jegelka says. “Learning with structured objects, allowing interpretability by and interaction with experts, and generating structured complex data — we integrate all of these.”
“‘Synthesizability’ is an example of a concept that is central to materials science yet lacks a good physics-based description,” says Bryce Meredig, founder and chief scientist at Citrine Informatics, a company that brings big-data and artificial-intelligence techniques to bear on materials science research. “As a result, computational screens for new materials have been hamstrung for many years by synthetic inaccessibility of the predicted materials. Olivetti and colleagues have taken a novel, data-driven approach to mapping materials syntheses and made an important contribution toward enabling us to computationally identify materials that not only have exciting properties but also can be made practically in the laboratory.”
The research was supported by the National Science Foundation, the Natural Sciences and Engineering Research Council of Canada, the U.S. Office of Naval Research, the MIT Energy Initiative, and the U.S. Department of Energy’s Basic Energy Science Program.


Source: 

Burj Khalifa stands tall with ZINC protection

At an incredible 828 metres (2,716.5 feet) and more than 160 storeys, Dubai’s Burj Khalifa is the world’s tallest building.
Engineers, designers, architects and skilled workers from more than 100 countries were involved in the making of the building. The construction took 22 million man-hours, with more than 12,000 dedicated personnel ensuring the seamless progress of the building.
Burj Khalifa’s primary structure is made of reinforced concrete. The construction used 330,000 m3 of concrete and 55,000 tonnes of galvanized steel rebar. A cathodic protection system is in place under the concrete to counter the corrosive groundwater and prevent corrosion.
Due to the hot and humid outside conditions, an additional protective coating to the structure was required. Thus, the steel pipes used in the inner skeleton of the building were hot-dip galvanized along with the steel construction around the entrance to maximise protection.
The world’s tallest building is another testament to the strength of Zinc. We all have Zinc in our lives.
Hindustan Zinc is India’s only, and the world’s leading, Zinc-Lead-Silver producer.


Source: 

KRCL, IIT Bombay sign MoU to aid tunnel tech institute


The George Fernandes Institute of Tunnel Technology, established by Konkan Railway Corporation Ltd. at Madgaon, is set to be further strengthened with support from the Indian Institute of Technology, Bombay.

KRCL and IIT Bombay entered into a memorandum of understanding to this effect in Mumbai on Tuesday for technical collaboration. The corporation had already entered into an MoU with ETH Zurich, a science, technology, engineering and mathematics university in Switzerland, this August.

While IIT Bombay is recognised worldwide for its high-quality education and for undertaking cutting-edge research in various areas of science, technology, engineering, management and humanities, KRCL has been a leader in the construction of transportation tunnels.

IIT Director D.V. Khakhar and KRCL Chairman and Managing Director Sanjay Gupta signed the MoU at the former’s premises.
The pact seeks to develop the tunnel technology institute into a world-class premier centre of knowledge in tunnel and underground structure technologies. It also seeks to provide opportunities for UG and PG students of the IIT to gain practical experience.


Source: 

Estimating Ground Motions In The Largest Crustal Earthquakes

Earthquakes cause many deaths and injuries worldwide. For example, the 2015 Nepal earthquake is thought to have killed nearly 9,000 people. This event was only one of many events contributing to the 750,000 deaths attributable to earthquakes during the period 1996 to 2015 according to a recent report by the United Nations.

This is a higher total than deaths caused by all other natural disasters (e.g. storms, heatwaves and floods) combined. Earthquakes also cause considerable economic losses. For example, the 2016 Kumamoto (Japan) earthquake is thought to have caused $42 billion worth of damage, according to the Japanese government. The largest earthquakes can also have dramatic and permanent effects on the landscape; for example, roughly 180 km of surface faulting occurred in the 2016 Kaikoura (New Zealand) earthquake.

The vast majority of earthquake-related deaths and injuries and economic losses are caused by damage (including total collapse) to buildings or their contents. When designing buildings and infrastructure (e.g. bridges and power plants) to withstand earthquakes, engineers require estimates of the ground motions that their structure could be subjected to in future earthquakes. These estimates are computed by engineering seismologists using two sets of mathematical models.
The first of these provides an assessment of the rate of occurrence of earthquakes of different sizes within a few hundred kilometers of the structure, e.g. how often does a magnitude 6 earthquake occur on a nearby fault? The second model predicts what the ground motion would be given the occurrence of different earthquakes, e.g. if the magnitude 6 earthquake occurred what is the expected ground shaking at the base of the structure? Many hundreds of these so-called ground-motion models have been published.

One of the greatest challenges in developing ground-motion models is a lack of recordings of previous earthquakes, particularly from within the region of interest. This challenge is particularly acute for the largest earthquakes because they happen so rarely: on average, fewer than one crustal earthquake of magnitude 7.2 or larger per year is recorded at close distances by modern seismographic networks.

In our recent study, we collected data from 38 crustal earthquakes of magnitude 7.2 or larger that had occurred worldwide over the past 60 years. Many of these data had been forgotten and never collated before, and hence had not been used to derive the most recent ground-motion models. Because of its importance for seismic design, and because it is often the only information available about ground motions in past earthquakes, our study considered the maximum horizontal acceleration measured from each of the available seismograms. The newly collated data were compared to predictions from eight ground-motion models that are routinely used to assess the earthquake shaking that a structure may suffer during its lifetime.
We found that these eight models provide, on average, a good match to the maximum horizontal accelerations observed in the largest crustal earthquakes. This is reassuring for earthquake engineering. The study, however, did show that the ground motions in some earthquakes (e.g. the 2001 Bhuj earthquake in India) were much higher than those that would be expected given the magnitude.
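Such comparisons are conventionally made on log residuals, since ground motions are roughly log-normally distributed. A toy sketch of the calculation, with invented acceleration values rather than the study's data:

```python
import math

# Invented observed and model-predicted peak ground accelerations (in g)
# for a handful of hypothetical records.
observed  = [0.35, 0.62, 0.18, 0.90]
predicted = [0.30, 0.55, 0.20, 0.45]

# Residuals are taken in log space; a positive residual means the model
# under-predicted the observed shaking.
residuals = [math.log(o) - math.log(p) for o, p in zip(observed, predicted)]
mean_residual = sum(residuals) / len(residuals)
print(round(mean_residual, 3))  # 0.215
```

A mean residual near zero across many records is what "a good match on average" means; individual events like Bhuj would show up as large positive residuals.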

The data collated for this study will be invaluable for the derivation of new ground-motion models for use in the design of safer structures and consequently for the reduction of earthquake risk.


Source: 

Koyna earthquake dented irrigation development, dam technology

The Koyna earthquake in Maharashtra claimed about 200 lives on this day 50 years ago. Though the earthquake caused only minor damage to the Koyna dam and some houses in the vicinity, it is perhaps the only scientific event in India that affected science, society and engineering alike. Until the quake, peninsular India was considered aseismic, and most big civil engineering projects were constructed under this assumption.

For the engineering community, the Koyna accelerogram - which had recorded an acceleration of 0.67 g - was a boon. This was the first accelerogram recorded in India. Prior to this, most dam engineers and designers had been adopting the Californian accelerogram for design purposes. The Koyna record has since been used for the design of several dams, atomic power plants and refineries in India.

Immediately after the earthquake, there was no scientific explanation for the event. Some observations were made by scientists, engineers and the common man. The educated community, which included the scientists and engineers, argued that the event should be studied properly. They found that the area and the Konkan region had been experiencing earthquakes, and that one such quake had occurred around 400 years ago. However, their observations were totally eclipsed by the wild imagination of the common man.

The strong belief was that the construction of the Koyna dam was the main reason behind the earthquake. A new acronym, RIS (reservoir induced seismicity), was coined. Advocates of this theory proposed two possible causes for the quake: one, water in the reservoir had trickled down and lubricated the underground fault causing the quake and, two, the weight of the water in the reservoir was far higher than the loadbearing capacity of the rock. A number of seminars, conferences and functions on RIS were held during subsequent years. Some narcissistic scientists also joined the RIS bandwagon without realising that they were batting on a weak wicket.

The two hypotheses were negated with scientific facts. When the lake-tapping experiment was conducted, it was found that there was no water or moisture in the rock. Computer programmes were run to find the effective load on the rock. It was found that the load due to the water body was around 3.5-4 kg per sq ft - far lower than the loadbearing capacity of the rock. In fact, the late Dr K L Rao, an eminent engineer, scholar and then Union minister, compared the Koyna reservoir sitting on the rock to a fly sitting on an elephant.
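Dr Rao's comparison can be checked with a back-of-the-envelope hydrostatic calculation. The reservoir depth and rock strength below are assumed round numbers for illustration, not figures from the article:

```python
# Back-of-the-envelope check of the "fly on an elephant" argument.
# All numbers here are illustrative assumptions.
RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2
depth_m = 80.0       # assumed reservoir depth at the deepest point

# Hydrostatic pressure exerted by the water column on the reservoir floor.
pressure_pa = RHO_WATER * G * depth_m
pressure_mpa = pressure_pa / 1e6

# Typical compressive strength of hard bedrock (order of magnitude).
ROCK_STRENGTH_MPA = 150.0

print(f"water load ~{pressure_mpa:.2f} MPa vs rock strength ~{ROCK_STRENGTH_MPA} MPa")
```

Even under these rough assumptions the water load comes out roughly two orders of magnitude below the rock's load-bearing capacity, which is the substance of the fly-and-elephant analogy.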

With RIS being unacceptable to a large number of scientists, it was renamed over the years as RAS (reservoir associated seismicity) and RTS (reservoir triggered seismicity) before being finally abandoned.

Apart from the debate it triggered among seismologists, the Koyna earthquake caused severe damage to the credibility of irrigation development and dam technology. Taking a wrong cue from the earthquake, the Narmada project was challenged in various courts and protest marches were held in several cities. The project was delayed for 11 years before it could be commissioned.

Likewise, the construction of the Tehri dam was also opposed. The main reason for opposing the project was, again, that it might induce an earthquake. Now, whether a dam is constructed or not, the Himalayan region will always have some earthquake activity. I remember visiting several hydro-electric projects in the country to install seismological instruments in and around the dam or project area.

After their failure to block the Narmada and Tehri projects, the protesters took their opposition to the Subansiri dam in Assam. Here, the protest took a somewhat dangerous turn bordering on violence. The protesters even sat on the rail tracks to block a train carrying equipment and machinery for the construction of the dam.

It has now become almost regular practice to object to the construction of dams, irrigation projects, hydro-power projects or atomic power plants. Considering the increasing demand for power, we need more dams and hydro-electric and solar power generators. 
Presently, some Himalayan states such as Sikkim, Uttarakhand and Himachal Pradesh are developing mini, micro and medium hydro-electric projects. In addition, Bhutan and Nepal are providing India with a good amount of electricity generated at hydro-electric projects. In the Sahyadri mountainous region of Konkan in Maharashtra, about 2,200 sites have been identified for micro hydel power projects similar to those in Himachal Pradesh. But neo-environmentalists are opposing these projects, saying they would have an adverse impact on the environment.

In fact, opposing developmental activities happens regularly, be it over the construction of a flyover in the Malabar Hill area of Mumbai, road-widening in a city or building of an airport or highway. People oppose projects and fight a legal case right up to the Supreme Court, which leads to a delay and rise in project cost.


I sometimes wonder whether this trend to oppose projects had its root in the aftermath of the Koyna earthquake. However, wise men try to convince me that the opposition is mainly politically motivated. Contemporary history tells us that till 1975 or so, no developmental project was opposed. I remember how the Bhakra and Hirakud dams, which Jawaharlal Nehru had announced as new temples of development, were jubilantly welcomed by everybody.

Source: 

Top 10 Free and Open Source Construction Management Software - Construction Management or Construction Project Management Software is one of many types of computer programs that help manage, organize, and document many aspects of a particular job or project.


Scheduling, finance, and production software may be needed to effectively coordinate and implement all aspects of a job. In some cases, these elements can be found in a single software system.
Here are the top 10 Free and Open Source Construction Management Software:

2-PLAN

2-plan Project Management Systems offers three PM tools: a free desktop system, open-source software for multiple projects and teams, and a scrum board.

EFFICIENTSOFTWARE

Efficient Calendar Free is an easy-to-use, professional and award-winning free scheduling application that can help you plan and manage your time.

Estimate

Estimate is an open source, web-based construction cost estimating software designed for medium and large civil construction and EPC (Engineering, Procurement and Construction) companies. Features include Management of Schedule of Rates, Analysis of Rates, Project Estimation (Definitive and Control), Tender Evaluation, Cost Sheet preparation, BOQ Generation, Audit and Projection.

ESTIMATORAPPLICATION

Estimating software for the construction industry. With this free program you can create estimates for construction projects and print an estimate and a proposal.
Characteristics of the application:
– default cost type & group library;
– project cost type & group library;
– copying of substantiations from different estimations;
– copying of contract specifications from different estimations;
– export of estimations and cost type libraries to and from Excel;
– print an estimation;
– print a proposal;
– it is portable;
– it is free.

TAIGA

Taiga is a project management platform for agile developers & designers who want a simple, beautiful tool that makes work truly enjoyable.

WRIKE

Wrike is online project management software that gives you full visibility and control over your tasks, making projects easier to manage. Its tools include time tracking, project planning and organization, an interactive timeline, and communication and online collaboration features for teams of any size.

ORANGESCRUM

Orangescrum is a free, open source project management and collaboration tool that helps you manage projects, teams and tasks in one place, with time tracking, invoice generation and more.

OPENDOCMAN

OpenDocMan is a free, web-based, open source document management system (DMS) written in PHP, designed to comply with the ISO 17025 and OIE standards for document management. It features web-based access, fine-grained control of access to files, and automated installs and upgrades.

Open Workbench

Open Workbench is a desktop application for project management and scheduling in which you can define a work breakdown structure, set dependencies and resource constraints, assign resources to tasks, auto-schedule, and then monitor progress.

GENIEBELT

GenieBelt is simple and affordable construction project management software with a construction app, offering Gantt charts, communications, project overviews and more.
The Free and Open Source Construction Management Software tools listed above are by no means the only ones available; in the ever-evolving technology world, more are being created all the time.


Source: 

TS moots part-time courses for technology students - TS Perspective Plan on technical education identifies 14 thrust areas

The Telangana Government wants to introduce part-time education opportunities for undergraduate technology students, equipping them with additional skills even while they pursue their degree courses.
The idea is to offer certification and capsule courses in a variety of fields connected to their core subjects so that students can gain employment while studying or at least start earning immediately after their graduation.
This was stated in the Perspective Plan of Technical Education in Telangana State (PPTE), prepared by the Telangana State Council of Higher Education (TSCHE). The plan was also recommended to the All India Council for Technical Education (AICTE), which grants permission for new courses at technical institutes across the country.

Rural background
The reason cited was that a large number of students entering technical education come from poor socio-economic backgrounds and from rural areas, where exposure to the life skills needed for the new era is almost nil. “To support their family they need to work as soon as they complete their course of study. To upgrade their qualification while working, there is a need for part-time education opportunity, as is offered at PG level,” TSCHE Chairman T. Papi Reddy said.
Professor Reddy said such short-term courses existed earlier, and the AICTE has now been requested to re-introduce them. However, these courses should be in fields where there is demand. For example, for students of CSE and IT, there are opportunities in areas such as Graphic Design and Web Design, Information Security and Ethical Hacking, Animation and Multimedia, Digital Marketing, and Networking.
Even if engineering graduates do not get campus placements immediately after their courses, such short-term courses can keep them employed for a certain period while they continue trying to get into good companies.

Temporary placement
“Moreover, such experience also helps them. Instead of students going to private institutes, the AICTE should make it mandatory to offer such courses to interested students even as they concentrate on core subjects,” he said.
In fact, the Perspective Plan has identified some key areas where Telangana in particular may need trained professionals to exploit its resources.
The PPTE suggested that the AICTE sanction programmes in Mining, Granite, Textile, Pharmacy, Automobile and Construction Technology, based on new technologies and the needs of industry, keeping in view the 14 thrust areas identified in the document.
The 14 thrust areas include Life Sciences, including Bulk Drugs, Formulations, Vaccines, Nutraceuticals and Biologicals; Incubation Centres, R&D facilities and Medical Equipment; IT Hardware, including Bio-Medical Devices and Electronics; Precision Engineering, including Aviation, Aerospace and Defence; Food Processing and Nutrition Products, including Dairy, Poultry, Meat and Fisheries; Textiles and Apparel; Leather and leather value-added products such as shoes and purses; Gems and Jewellery; Renewable Energy and Solar Parks; and Mineral-based and Wood-based Industries, among others.

Source: