
Search Results


  • Unity UDP

    Create once, publish everywhere. Hey everyone, Unity has just launched a new and very developer-friendly feature. As developers, we put a lot of thought and effort into designing and building our games and other Unity projects, and publishing to the various stores then eats another big chunk of time that could instead go into making the game better or learning new technologies. For every such person or team, Unity now offers publishing to multiple stores in one go through the Unity Distribution Portal (UDP). Currently four stores are part of the portal, and more will join soon. UDP thus reduces the complexity of publishing to separate app stores, while also giving you the opportunity to grow and connect with millions of players worldwide.

    Participating UDP stores:
      - Catappult
      - MOO Store
      - Jio Games Store
      - One Store

    Some important points about UDP:
      - UDP is entirely free for developers.
      - UDP works with Unity-supported Android mobile form factors.
      - UDP does not support hardware-related optimizations.
      - You can select all of the stores or just a few, as you wish.
      - UDP does not handle payouts from the participating stores; those need to be arranged separately.
      - UDP does not block any third-party SDKs for tracking, optimization, ads, etc.
      - As of now, UDP only supports games made with Unity.

    Reference: https://distribute.dashboard.unity.com/udp/homepage

  • Hololens 2

    In a few words, HoloLens 2 is an all-in-one mixed reality device with apps and solutions ready to take a business to a new level. It is the successor to the original Microsoft HoloLens, and it is here to do business rather than just sit in an experimental phase. HoloLens 2 offers a far more comfortable and immersive mixed reality experience, and with Microsoft's cloud and AI services it is reliable, secure and scalable.

    What's new in HoloLens 2

    Immersive: We can see multiple holograms at once through the greatly increased field of view. 3D images appear at a higher resolution, so you can pick out the finer details of 3D models.

    Ergonomic: Easy to wear and easy to switch in and out of. Its dial-in fit system lets you wear it longer and more comfortably. You can keep your glasses on and slide the headset over them, and to step out of mixed reality and switch tasks, just flip the visor up.

    Instinctual: We only really call it mixed reality when we can touch, grasp and move holograms in natural ways, and that is exactly what HoloLens 2 delivers. Not only can you handle virtual objects like real ones, they also respond a lot like real objects. Another amazing thing is that you can give voice commands and instructions even in noisy industrial environments.

    Untethered: HoloLens 2 sets you free to move with no wires attached. As an all-in-one device, it is a computer in itself with Wi-Fi connectivity.

    What you can do with HoloLens 2

    Empower your business with industry-ready apps and solutions. There are already apps and solutions available that use HoloLens 2 to empower your employees, for example:
      - Bentley Synchro
      - PTC Vuforia Studio
      - Philips Azurion
      - Trimble XR10 with HoloLens 2

    Start developing solutions. It is compatible with well-known tools like Unity, Unreal and Vuforia, and follows open standards such as Khronos' OpenXR.

    Reference: https://www.microsoft.com/en-us/hololens

  • Stadia

    Stadia is a cloud gaming platform by Google: a browser-based cloud service built specifically for game players and game developers. Since it is browser based, all you need to work with it is a good internet connection and the Google Chrome browser. Google has a large number of data centers across the globe, so Stadia already has an advantage over other cloud gaming services: most players will be geographically close to a data center.

    Plus points
      - Stadia supports streaming games in HDR at 60 frames per second in 4K resolution, with plans to eventually reach 120 frames per second at 8K.
      - Gone are the days when players needed powerful hardware and downloaded software to play games. Stadia makes it easy for players to play online and share their live sessions directly on YouTube.
      - To reduce input latency, Google has developed its own controller that connects independently to the Google data center over Wi-Fi, but Stadia supports any HID-class USB controller as well.
      - Google has also integrated its AI assistant, Google Assistant, with Stadia, so it can automatically find videos relevant to the game you are playing.
      - It supports all device types effortlessly: TVs, laptops, desktops, tablets and smartphones.

    Hardware stack: Stadia runs on the powerful hardware such a platform needs.
      - Custom 2.7 GHz hyperthreaded x86 CPU with AVX2 SIMD and 9.5 MB of L2+L3 cache
      - Custom AMD GPU with HBM2 memory and 56 compute units, capable of 10.7 teraflops
      - 16 GB of RAM with up to 484 GB/s of bandwidth
      - SSD cloud storage

    Software stack: Built on Linux, Stadia uses Vulkan, the next-generation cross-platform graphics and compute API, with custom layers optimized for cloud-native gaming. It has APIs for managing games: saves, multiplayer modes, suspending and resuming gameplay, and so on.

    Developer tools: Your favourite game development tools are already part of Stadia; the two most important are Unreal Engine and Unity. Others include Havok, RenderDoc, Visual Studio, LLVM, AMD Radeon GPU Profiler, IncrediBuild, Umbra 3, FaceFX and Intelligent Music Systems.

    The best part is still to be told: because the platform is cloud based, you never need to upgrade your hardware or software, yet you are always up to date. Cool.

    Reference: https://stadia.dev
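    Those GPU numbers hang together: a quick back-of-the-envelope check (my own arithmetic, assuming a GCN-style AMD design with 64 ALUs per compute unit and 2 FLOPs per fused multiply-add, which is not part of Google's published spec) recovers a plausible clock speed from the quoted 56 compute units and 10.7 teraflops.

    ```python
    # Sanity check of Stadia's quoted GPU figures. Assumptions (mine, not
    # Google's spec): a GCN-style compute unit has 64 ALUs, and one fused
    # multiply-add counts as 2 floating-point operations per cycle.
    compute_units = 56
    alus_per_cu = 64
    flops_per_alu_per_cycle = 2
    quoted_tflops = 10.7

    implied_clock_hz = quoted_tflops * 1e12 / (compute_units * alus_per_cu * flops_per_alu_per_cycle)
    print(f"Implied GPU clock: {implied_clock_hz / 1e9:.2f} GHz")  # roughly 1.49 GHz
    ```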

  • 3dRudder Motion Controller

    A foot-powered motion controller is here to give you a much more immersive PS4 experience. It is going to be launched on 17th June 2019. The idea came from an architect, Valerio Bonora (now a co-founder of 3dRudder), who found it difficult to zoom in and out and rotate, and was getting pain in his wrist, so he came up with the idea of a foot-powered motion controller. Here is the link: https://blog.eu.playstation.com/2019/04/04/an-in-depth-look-at-the-3drudder-motion-controller-for-playstation-vr-launching-this-summer/amp/

    With your feet set on the 3dRudder, you control your movements with your feet, so your hands stay free to attack and handle all the other interactions. On top of the device is a slim circular platform onto which you rest your feet; this platform sits on a rounded bottom half. To move in any direction, tilt your feet in that direction: the steeper the tilt, the faster you move (see the sketch below). You can walk straight, sprint while taking a corner or stop exactly where you want to, in a smooth, seamless way. There are no sticks and no buttons. Two kinds of sensors, an IMU (Inertial Measurement Unit) and pressure sensors, track your movements and convert them into in-game motion. It connects to your PS4 easily over USB, and the device is compatible (and combinable) with PlayStation Move motion controllers, the PS VR Aim controller, and the DualShock 4.

    This is a technology to follow: it is surely going to change the game for VR/AR gamers, and it will be exciting to see how it fits into, and changes, the VR/AR-enabled world.
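    3dRudder has not published how its firmware maps tilt to motion; purely as an illustrative sketch of the "steeper tilt, faster movement" idea described above (all constants and names below are made up, not from the 3dRudder SDK), a game could translate IMU pitch and roll into a movement vector like this:

    ```python
    import math

    # Illustrative only, not 3dRudder's actual firmware or SDK: map the
    # platform's pitch/roll tilt, as reported by an IMU, to a planar
    # movement vector whose speed grows with the tilt angle.
    MAX_TILT_DEG = 18.0   # hypothetical mechanical limit of the platform
    MAX_SPEED = 3.0       # hypothetical top speed, metres per second
    DEAD_ZONE_DEG = 2.0   # ignore tiny tilts so standing still stays still

    def tilt_to_velocity(pitch_deg, roll_deg):
        """Convert pitch (forward/back) and roll (left/right) tilt into (vx, vz)."""
        tilt = math.hypot(pitch_deg, roll_deg)
        if tilt < DEAD_ZONE_DEG:
            return (0.0, 0.0)
        speed = MAX_SPEED * min(tilt, MAX_TILT_DEG) / MAX_TILT_DEG
        heading = math.atan2(roll_deg, pitch_deg)  # tilt direction = travel direction
        return (speed * math.sin(heading), speed * math.cos(heading))

    print(tilt_to_velocity(10.0, 0.0))   # lean forward -> move straight ahead
    print(tilt_to_velocity(18.0, 18.0))  # strong diagonal lean -> clamped to full speed
    ```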

  • Nissan AR Concept

    Image source: https://nissannews.com

    At CES 2019, Nissan introduced a ground-breaking concept called 'Invisible to Visible', or I2V. This technology is going to change the world of driving. The broad idea of the concept is to make driving amazingly easy by providing information that is not visible or accessible through human eyes, ears or other senses. For example, it can tell you in advance that someone is about to cross the road, along with their approximate distance. Another example: it would warn you about small vehicles like bicycles or motorbikes when they are about to overtake you from the side, and it would inform you about obstacles such as a fallen tree on the road. What I found to be an unbelievable use of AI and image processing is this: if it is quite foggy and you cannot see clearly through the windshield, the technology processes the foggy view and replaces it with an image of the same place as it would look if there were no fog. And the story does not finish here: you can also get a virtual driving companion, anyone you like, who may not be physically with you but will be there virtually, to increase human interaction. You can also switch on a mode where a pro driver appears with their car, moving ahead of you continuously and giving you personalised driving instructions. Of course, last but not least, it tells you the optimum route to your destination and guides you to parking spaces that are free or about to become free. A few years ago, virtual and augmented reality seemed to lose pace for various reasons, but it has now come back far more mature and ready to take on the commercial market in every sector. Many more things are coming to change the world around you; keep coming back to the blog for more such info. Demo: https://www.youtube.com/watch?v=x2mvhhjoPU4
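    Nissan has not published how I2V's fog processing works; purely as a toy illustration of the image-processing idea (a simple local-contrast boost with OpenCV's CLAHE, a far cry from I2V's full scene reconstruction, and the file names are placeholders of mine), the first step of clearing up a foggy frame might look like this:

    ```python
    import cv2

    # Toy illustration only: boost local contrast in a foggy frame with CLAHE,
    # a classic first step in simple dehazing. Not Nissan's I2V pipeline.
    frame = cv2.imread("foggy_road.jpg")                  # placeholder file name
    lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)          # work on lightness only
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))
    enhanced = cv2.merge((clahe.apply(l), a, b))
    result = cv2.cvtColor(enhanced, cv2.COLOR_LAB2BGR)
    cv2.imwrite("defogged_road.jpg", result)
    ```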

  • Amazon Sumerian

    Amazon Sumerian is a completely browser-based engine platform. While Unity needs to be installed on your computer, Amazon Sumerian needs no installation. Beyond that basic difference, there are many ready-to-use features available in Sumerian. The platform tries to make development easy for anyone without deep technical know-how: it lets you create and run virtual reality (VR), augmented reality (AR) and 3D applications quickly and easily, without requiring any special programming or 3D graphics expertise. You can build immersive, interactive scenes that run on popular hardware such as Oculus Go, Oculus Rift, HTC Vive, HTC Vive Pro, Google Daydream and Lenovo Mirage, as well as Android and iOS mobile devices.

    Recently there was a challenge called the 'Amazon Sumerian Challenge', in which anyone could build and submit a project made on the platform. Delhi Technology Club participated and submitted a small project called 'Musicolge', meaning Music + Knowledge. It was made to teach the language of music, i.e. the staff, note values and time signatures. The purpose was to build an interactive demo that makes it very easy for a player or operator to get a good feel for reading sheet music.

    How we built it: we built it using Amazon Polly, Amazon Cognito, Sumerian and virtual reality. Much to our surprise, Sumerian ships with ready-to-use characters and a library of expressions and animations. You can add a character to your project and build a state diagram defining a sequence of its animations, and you can also attach audio to the states. So you do not need to worry much about characters and their animation, and can focus on the main purpose of your project.

    More info: https://aws.amazon.com/sumerian/
    Here is the link to our project: https://www.delhitechnologyclub.com/musicolge
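    Since the project leaned on Amazon Polly for spoken audio, here is a minimal, hedged sketch of generating a speech clip with Polly from Python via boto3; the voice, text, region and file name are illustrative choices of mine, not what Musicolge actually shipped, and AWS credentials are assumed to be configured in the environment.

    ```python
    import boto3

    # Minimal sketch of text-to-speech with Amazon Polly. Voice, text and
    # file name are examples only, not taken from the Musicolge project.
    polly = boto3.client("polly", region_name="us-east-1")
    response = polly.synthesize_speech(
        Text="A time signature tells you how many beats are in each measure.",
        OutputFormat="mp3",
        VoiceId="Joanna",
    )
    with open("lesson_line.mp3", "wb") as f:
        f.write(response["AudioStream"].read())
    ```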

  • Realtime or Rendered

    This year during the GDC keynote, the Unity team showcased a teaser of an upcoming short film named 'The Heretic'. In the teaser a man walks through a cave-like place. Amazingly, he is not a real man: he is rendered so convincingly that you cannot believe your eyes. The complete scene, from the man's face to the shine of the water on the ground to his eye movements, is entirely rendered yet looks entirely real. The demo shows off Unity's next-level rendering capabilities. To create it, Unity built some customizations on top of its Scriptable Render Pipeline (SRP) architecture, and the demo runs on Unity 2019.1. This was also Unity's first demo to showcase real-time digital humans: the team combined 3D and 4D scanning data and built a complete pipeline from data acquisition to real-time rendering. To get the look just right, the team created shaders on the basis of Unity's HDRP to achieve a true-to-life result. More info: https://blogs.unity3d.com/2019/03/19/the-heretic-megacity-release-real-time-ray-tracing-and-more-news-from-gdc-2019/

  • Microsoft Azure Kinect DK

    Microsoft Azure Kinect DK, where DK stands for Developer Kit, is a big step from Microsoft towards a world that will depend on the accuracy of lenses and sensors. The hardware carries advanced AI sensors meant for sophisticated vision and speech workloads, targeting the future of image processing. It is not aimed at end consumers; it is for developers and commercial businesses who can take an idea and use the capabilities of the SDKs to turn it into a product. It comes with the Sensor SDK, the Body Tracking SDK and Azure Cognitive Services.

    Sensor SDK: gives access to and control of the depth camera, RGB camera, motion sensor (gyroscope and accelerometer), device calibration data and more. One can easily imagine the power these sensors carry: they can capture the motion of an object or person in great detail, track it, and produce results that can be manipulated and used in many ways.

    Body Tracking SDK: includes a Windows library and runtime to track bodies in 3D using the Azure Kinect DK hardware. It offers features like body segmentation, uniquely identifying each person, and the ability to track bodies over time, and Microsoft also provides a viewer tool to inspect tracked bodies in 3D. Cool, na?

    Azure Cognitive Services: this completes the solution. For example, you can use voice controls to interact with a product that scans objects for dimensions and labels.

    Supported operating systems and architectures:
      1. Windows 10 April 2018 release (x64) or later
      2. Linux Ubuntu 18.04 (x64) with an OpenGL 4.4 or later GPU driver

    More info: https://azure.microsoft.com/en-in/services/kinect-dk/
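    The Body Tracking SDK itself is exposed in C/C++ and C#; purely as an illustration of what its output (3D joint positions, reported in millimetres) makes possible, here is a small numpy sketch, with made-up joint coordinates, that computes an elbow angle from three tracked joints:

    ```python
    import numpy as np

    # Illustration of what body-tracking output enables, not the Azure Kinect
    # Body Tracking SDK's own API: given 3D positions of the shoulder, elbow
    # and wrist joints, compute the elbow flexion angle.
    def joint_angle(a, b, c):
        """Angle at joint b, in degrees, formed by segments b->a and b->c."""
        u, v = a - b, c - b
        cosine = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return float(np.degrees(np.arccos(np.clip(cosine, -1.0, 1.0))))

    # Example joint positions in millimetres (made-up numbers for illustration).
    shoulder = np.array([-150.0, -300.0, 900.0])
    elbow    = np.array([-160.0,  -20.0, 880.0])
    wrist    = np.array([  40.0,   60.0, 860.0])
    print(f"Elbow angle: {joint_angle(shoulder, elbow, wrist):.1f} degrees")  # roughly 110
    ```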

  • Enflux

    Enflux is one of the first products to enable real-time motion capture using a suit. We have all seen Marvel Studios movies, or films like Real Steel, where body movements drive a virtual 3D model or a robot. The Enflux product consists of a shirt and pants with 10 sensors attached. When the person wearing the suit moves, the sensors pick up the movements and send them over Bluetooth to software on your computer, which in turn moves a 3D model exactly as you move, in real time. Enflux supports Blender and Unity on Windows, which means that if you have a 3D model in Blender or Unity on a Windows machine, you can drive it with your own movements in real time, after some configuration of course. The applications are limitless; it can be used almost anywhere we use our body to do anything, be it yoga, dance, exercise, repairing things, and so on. Further, if you can control a robot with the 3D model, you can do everything the boy does with his robot in the movie Real Steel, yeah. There are some limitations, though: Enflux does not track all your movements, such as your fingers, toes, heels, face and neck; it is limited by its sensors. Alongside Enflux, other devices have appeared that rely on image processing instead and are therefore not limited by on-body sensors, so it will be interesting to see what the future holds and which approach wins out in real-time motion capture. More info: https://www.getenflux.com/
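    Enflux has not published its sensor-fusion pipeline; as a hedged sketch of the general idea behind turning a worn IMU's readings into a joint orientation (a textbook complementary filter with illustrative constants of my own, not Enflux's algorithm), consider:

    ```python
    import math

    # Sketch of the general idea behind IMU-based motion capture, not Enflux's
    # actual algorithm: fuse gyroscope and accelerometer readings into a pitch
    # angle. One such estimate per sensor, mapped onto the rig's bones, is
    # what animates the 3D model.
    ALPHA = 0.98  # trust the gyro short-term, the accelerometer long-term
    DT = 0.01     # seconds between samples (100 Hz, illustrative)

    def update_pitch(pitch_deg, gyro_y_dps, accel_x_g, accel_z_g):
        """One filter step: integrate the gyro, correct drift with gravity."""
        gyro_pitch = pitch_deg + gyro_y_dps * DT
        accel_pitch = math.degrees(math.atan2(accel_x_g, accel_z_g))
        return ALPHA * gyro_pitch + (1 - ALPHA) * accel_pitch

    pitch = 0.0
    # Fake stream: a limb rotating at 30 deg/s, with gravity components
    # derived from the current tilt.
    for step in range(100):
        pitch = update_pitch(pitch, gyro_y_dps=30.0,
                             accel_x_g=math.sin(math.radians(pitch)),
                             accel_z_g=math.cos(math.radians(pitch)))
    print(f"Estimated pitch after 1 s: {pitch:.1f} degrees")  # ~29, slightly damped
    ```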

  • 3 Generations of Solar Cells

    Here is a sneak peek at the generations of solar cells. Solar cells are currently in their THIRD generation, and believe it or not, the solar panels installed around us actually belong to the FIRST generation. Second- and third-generation cells differ in that they are flexible and have an altogether different manufacturing process. Check out our research on Organic Solar Cells ⭐️

  • Solar Power

    Nature is the most intelligent of us all. Everything we understand about nature is science, and what we don't understand is magic. With the climate changing as it is, we are left with no alternative but to take nature's help: harnessing renewable energy is the answer. We are aware of renewable sources like wind, solar, biomass, geothermal, tidal and hydro, but why is it that solar energy is taking the lead, with huge research going into how to harness this form of energy more efficiently? There is a big reason behind that. Yes, we can generate 10,000 terawatts (technical value)
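    To put terawatt figures like that in context, here is a quick back-of-the-envelope calculation (standard physical constants; the arithmetic is mine, not from the post) of the total solar power Earth intercepts continuously:

    ```python
    import math

    # Back-of-the-envelope context for the terawatt figures above
    # (standard physical constants; arithmetic is mine, not from the post).
    SOLAR_CONSTANT = 1361.0   # W/m^2 at the top of the atmosphere
    EARTH_RADIUS = 6.371e6    # metres

    # Earth intercepts sunlight over its circular cross-section.
    intercepted_watts = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2
    print(f"Total intercepted solar power: {intercepted_watts / 1e12:,.0f} TW")
    # ~173,000 TW before atmospheric losses -- vastly more than humanity's
    # roughly 18 TW average demand, which is why even a small harvestable
    # fraction is huge.
    ```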

  • iOS 11 on iPad Pro

    Multitasking is the need of the hour in this busy world. Thanks to ever-evolving technology, which eases our lives and helps us keep pace with hectic schedules. iOS 11 is one such useful aid, and it revolves around the persona of MULTITASKING. You must be wondering what I am talking about and how iOS 11 works at the multitasking level. Yes, your excitement has struck the right chord, because this new release, version 11, makes the iPad Pro a complete replacement for a PC.

    The Dock is now available from any screen: you can open an app from it and move to any other app instantly, and a new feature suggests your recent apps. Split View lets you see two apps side by side, and in this view you can drag and drop text, links, photos, files and more between apps. iOS 11 introduces a Files app, in which files stored on your iPad or in services like iCloud Drive and Google Drive can be browsed, and it is so easy to open any file from there. Vertical sliders are provided for volume and brightness, which are easier to operate, and music can be played from the dock while you stay on the same screen.

    New advanced features: swiping down on a keyboard key types its special character (₹, @, ...), so there is no need to press the Shift key, which adds a new dimension to the user experience. The Notes app has also been improved: documents can be scanned, tables can be inserted and grid lines can be added, which redefines the level of ease and presentation. It is simple to operate and easy to write in yet impactful, and with its formatting options you can change the font, use bold or italics, add bullets and align content, making your notes presentable and shareable.
