We conduct university- and college-based tech shows, workshops, and seminars on different technology platforms and tools.
We cover different areas, such as:
NS-2: ns is a discrete-event simulator targeted at networking research. It provides substantial support for simulating TCP, routing, and multicast protocols over wired and wireless (local and satellite) networks. NS-2 is a widely used, open-source network simulator (GNU-licensed), publicly available at http://www.isi.edu/nsnam/ns/ns-build.html
Wireless network simulation areas include:
• Ad-hoc Network
• Sensor Network
• Mesh Network
Research focus areas in wireless networks include:
• Routing protocols
– Reactive, proactive, hybrid
• Cluster management
– To reduce overhead, to facilitate network management, to enable QoS, etc.
• Quality of service (QoS)
– Differentiating among different types of applications
• Medium access
– Closing the link, recognizing neighbors, scheduling transmissions, etc.
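At its core, NS-2 (like ns-3) is a discrete-event scheduler: events are queued with timestamps and executed in time order, so simulated time jumps from event to event rather than ticking uniformly. A minimal sketch of that idea in plain Python follows; the class, function, and node names here are illustrative, not NS-2's actual OTcl/C++ API:

```python
import heapq

class Simulator:
    """Minimal discrete-event scheduler, mimicking the idea behind ns-2."""
    def __init__(self):
        self.now = 0.0
        self._queue = []   # heap of (time, seq, callback, args)
        self._seq = 0      # tie-breaker for events scheduled at the same time

    def schedule(self, delay, callback, *args):
        # queue an event to fire `delay` simulated seconds from now
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback, args))
        self._seq += 1

    def run(self):
        # pop events in timestamp order, advancing the simulation clock
        while self._queue:
            self.now, _, callback, args = heapq.heappop(self._queue)
            callback(*args)

log = []
sim = Simulator()

def send_packet(src, dst, delay):
    # model link latency by scheduling the receive event in the future
    sim.schedule(delay, recv_packet, src, dst)

def recv_packet(src, dst):
    log.append((sim.now, f"{dst} received packet from {src}"))

sim.schedule(0.0, send_packet, "n0", "n1", 1.5)
sim.schedule(0.0, send_packet, "n0", "n2", 0.5)
sim.run()
# events fire in timestamp order regardless of the order they were scheduled
```

Note that n2's packet is delivered first even though it was scheduled second: the heap orders events by simulated time, which is exactly the property a network simulator relies on.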
NS-3: ns (from "network simulator") is the name of a series of discrete-event network simulators, specifically ns-1, ns-2, and ns-3, all primarily used in research and teaching. ns-3 is free software, publicly available under the GNU GPLv2 license for research, development, and use. The goal of the ns-3 project is to create an open simulation environment for networking research that will be preferred within the research community. During the development of ns-3 it was decided to completely abandon backward compatibility with ns-2; the new simulator was written from scratch in the C++ programming language. Development of ns-3 began in July 2006. A framework for generating Python bindings (pybindgen) and use of the Waf build system were contributed by Gustavo Carneiro. The first release, ns-3.1, was made in June 2008; afterwards the project continued making quarterly software releases, and more recently has moved to three releases per year. ns-3 made its fifteenth release (ns-3.15) in the third quarter of 2012.
The current status of the three versions:
• ns-1 is no longer developed or maintained.
• The 2009 build of ns-2 is partially maintained but is no longer being considered for journal publications.
• ns-3 is actively developed.
Digital Image Processing: the use of computer algorithms to perform image processing on digital images. DIP has many advantages over analog image processing: it allows a much wider range of algorithms to be applied to the input data and can avoid problems such as the build-up of noise and signal distortion during processing. Since images are defined over two dimensions (perhaps more), digital image processing may be modeled in the form of multidimensional systems. Different tools are used for DIP, such as MATLAB, Scilab, and IDL.
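As a flavour of the algorithms involved, here is a 3x3 mean (box) filter, one of the simplest noise-reduction operations, sketched in plain Python on a grayscale image stored as a list of rows. Tools like MATLAB and Scilab ship such filters built in; the function name and sample image below are our own illustrations:

```python
def mean_filter(img):
    """Apply a 3x3 mean (box) filter to a grayscale image (list of rows).
    Border pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]  # copy; original image is not modified
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # average the 3x3 neighborhood centered on (x, y)
            s = sum(img[y + dy][x + dx]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = s // 9
    return out

# a flat image with a single bright "noise" pixel
noisy = [
    [10, 10, 10, 10],
    [10, 90, 10, 10],
    [10, 10, 10, 10],
    [10, 10, 10, 10],
]
smoothed = mean_filter(noisy)
# the spike at (1, 1) is averaged down toward its neighbors: (8*10 + 90) // 9 = 18
```

The same idea scales up to the 2-D convolutions behind blurring, sharpening, and edge detection; the "multidimensional systems" view mentioned above is precisely this kind of operation over the image grid.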
Raspberry Pi: a low-cost, credit-card-sized computer that plugs into a computer monitor or TV and uses a standard keyboard and mouse. It is a capable little device that enables people of all ages to explore computing and to learn how to program in languages like Scratch and Python. It is capable of doing everything you'd expect a desktop computer to do, from browsing the internet and playing high-definition video to making spreadsheets, word processing, and playing games.
What’s more, the Raspberry Pi has the ability to interact with the outside world, and has been used in a wide array of digital maker projects, from music machines and parent detectors to weather stations and tweeting birdhouses with infra-red cameras. We want to see the Raspberry Pi being used by kids all over the world to learn to program and understand how computers work.
Cloud Computing: in computer science, cloud computing is a synonym for distributed computing over a network: the ability to run a program on many connected computers at the same time. More commonly, the term refers to network-based services that appear to be provided by real server hardware but are in fact served up by virtual hardware, simulated by software running on one or more real machines. Such virtual servers do not physically exist and can therefore be moved around and scaled up (or down) on the fly without affecting the end user, rather like a cloud.
Data Mining: data mining is the process of extracting patterns from data. It is commonly used in a wide range of profiling practices, such as marketing, surveillance, fraud detection, and scientific discovery. Data mining, an interdisciplinary subfield of computer science, is the computational process of discovering patterns in large data sets using methods at the intersection of artificial intelligence, machine learning, statistics, and database systems. The overall goal of the data mining process is to extract information from a data set and transform it into an understandable structure for further use. Aside from the raw analysis step, it involves database and data-management aspects, data pre-processing, model and inference considerations, interestingness metrics, complexity considerations, post-processing of discovered structures, visualization, and online updating.
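The pattern-extraction step can be illustrated with frequent-itemset counting, the support-counting core of the well-known Apriori algorithm used in market-basket analysis. The sketch below is a brute-force plain-Python version with made-up shopping baskets, not a production miner (real implementations prune candidate sets between passes):

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support, max_size=2):
    """Count itemsets of size 1..max_size across all transactions and
    keep those appearing in at least `min_support` transactions."""
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))  # dedupe and fix ordering within a basket
        for size in range(1, max_size + 1):
            for combo in combinations(items, size):
                counts[combo] += 1
    return {itemset: c for itemset, c in counts.items() if c >= min_support}

# illustrative transaction data
baskets = [
    ["bread", "milk"],
    ["bread", "butter", "milk"],
    ["bread", "butter"],
    ["milk"],
]
patterns = frequent_itemsets(baskets, min_support=3)
# only {bread} and {milk} occur in at least 3 of the 4 baskets
```

Frequent itemsets like these are then turned into association rules ("customers who buy bread also buy milk"), which is the "understandable structure" the paragraph above refers to.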