    Let's give the Kubuntu Devs a dash of income

    I think one of the main things the community hates about the Ubuntu dash is that it is not a transparent, opt-in service. It comes across as sneaky and creates fears of invading your privacy. I do not know how bad it actually is, but from what I have read it is not as bad as people seem to think.

    Be that as it may: how would Kubuntu users feel about an internet "product" or "service" search function which is a completely separate, independent application from the default built-in search?

    I suggest an application which
    a) is a separate application from the default local system search, and is easily installable/un-installable as a separate package.
    b) offers a way to set it as the default search for those who want to opt in.
    c) is easy to configure in terms of where you want it to search (e.g. check-boxes to enable searching various online sources).
    d) encourages users to use it by allowing one to easily toggle a search as "private" or "local only" or some such terminology.
    e) encourages people to use it by being open about the fact that the Kubuntu project earns revenue when it is used.

    #2
    I guess I do not understand why the current search methods, local search and a separate online search (i.e. Google or whatever you prefer), are not adequate and very good. What are the benefits of a single all-encompassing search feature, other than a fractional time saving from using one application?
    Linux because it works. No social or political motives in my decision to use it.
    Always consider Occam's Razor
    Rich



      #3
      I am sure there are benefits in having a unified search interface. I imagine that it would give an opportunity to present found items in new and innovative ways.

      The idea though is that the user gets some convenience and that at the same time there is a new, passive (from the end user's perspective) stream of income generated for the developers.

      When I want to shop online (as opposed to when I want to research some subject) I could see myself preferring a search bar which is already loaded (running), rather than having to start a browser or open a new tab for Amazon + Ebay + Google Play + ...

      If the usage is at least equally convenient I would use the one that earns an income for Kubuntu. Maybe I just feel guilty for getting a whole operating system and program set for free!



        #4
        Just my preference, but I would prefer direct contributions to Kubuntu rather than indirect ones.
        Linux because it works. No social or political motives in my decision to use it.
        Always consider Occam's Razor
        Rich



          #5
          OK, but that still leaves my original question unanswered.

          If there were an opt-in search feature, an optional extra, would you have any issue with that?

          P.S. I do not see these methods of contributing as being mutually exclusive.
          Last edited by Tahaan; Nov 21, 2013, 09:28 AM. Reason: Added P.S note



            #6
            One of the early features of Linux was that instead of a "kitchen sink" approach to applications, it used programs which did specific things very well and very fast. They were called "utilities": awk, sed, grep, find, whois, less, tar, mv, ls, chown, chgrp, attr, ip, lookup, mount, umount, ln, mkdir, pwd, mknod, locate, updatedb and others. They are small and lightning fast! They had to be both. Tar, the biggest of the group, is only 360Kb, and it is still one of the best backup/restore utilities available.

            Back in the day, memory and HD space were at a premium. With 512Kb of RAM and a 40Mb HD, one could not afford programs that were twice as big as the RAM in the machine ... too much memory/page swapping made them run slowly, and the HD couldn't hold very many apps that big and still leave room for the OS and other stuff.

            Now, things have changed. This laptop has 750Gb HD and 6GB of RAM. It has 8 cores and they run at 2.16GHz, 3GHz in turbo mode, a thousand times or more faster than the 486 PCs. It is not uncommon to have a 128 TiB core file. Some libraries and dozens of applications exceed 1Mb and many apps exceed 10Mb, 20Mb or 30 Mb in size.

            I love KDE, and it gives me a LOT of power, especially when it comes to graphics, mounting or dismounting devices, and dragging and dropping files and folders. But when it comes to searching for files, I find that locate is significantly faster than Dolphin's file search function, returning results in a second or two instead of minutes, and the use of regex is faster as well. I still use many of the cli utilities I listed above.

            Another thing I've noticed is the use of dynamically linked libraries. BECAUSE RAM and HD space are no longer a significant issue, I'd love to see applications become statically linked. For those without a background in programming, this means that the application contains within itself ALL of the libraries it calls, or at least the functions it uses from those libraries. A dynamically linked app might be only 500Kb to 5Mb in size. The same app, statically linked, might be 10 times larger. A 50Mb app? Skype is already 32Mb and it is dynamic. So what if it were 320Mb, if one didn't have to have the ia32 libs installed? A thousand such apps would take 320Gb of HD. So?

            Running statically linked apps would sure eliminate the "incompatible library" problem. Nothing was more aggravating than the developer linking against libraries on his system that weren't yet available to the user, and then the user not being able to find them, or to find the right version. Debian's package system eliminates that if one stays within the repository, but it wouldn't be a problem at all with statically linked apps.

            What about running more than one statically linked app at a time, say two or three apps that are 250Mb each? What happens now? We have time slicing and page swapping as programs are shifted in and out of memory when the user switches between two or more apps running simultaneously.


            But all this is a pipe dream. The fact is that in the shift from libc5 to libc6, support for static linking in the gcc compiler was severely hampered. Static linking won't work without the use of the Name Service Switch (controlled by /etc/nsswitch.conf), and NSS requires linking to shared libraries! Configuring glibc with --enable-static-nss won't work either, because you've got to link every static program that uses NSS routines with all the NSS and resolv libraries. So, static linking on Linux is highly problematic, if not entirely dead.
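            The NSS limitation is easy to reproduce with a minimal sketch (the file name is arbitrary): any program that looks up a user through glibc goes through NSS, and linking it with gcc -static on a glibc system typically prints a warning that the lookup will still need shared libraries at run time.

            ```c
            /* nss_demo.c: a user lookup that goes through glibc's Name Service
             * Switch. Build it dynamically with `gcc nss_demo.c -o nss_demo`;
             * building with `gcc -static` on a glibc system typically emits a
             * linker warning that getpwuid will still need shared libraries at
             * run time, which is exactly the limitation described above. */
            #include <stdio.h>
            #include <sys/types.h>
            #include <pwd.h>
            #include <unistd.h>

            int main(void)
            {
                uid_t uid = getuid();
                /* getpwuid() consults NSS, as configured in /etc/nsswitch.conf */
                struct passwd *pw = getpwuid(uid);

                if (pw != NULL)
                    printf("uid %u -> %s\n", (unsigned)uid, pw->pw_name);
                else
                    printf("uid %u -> (no passwd entry)\n", (unsigned)uid);
                return 0;
            }
            ```

            Whether the warning appears depends on the gcc and glibc versions in use, but the underlying point stands: the lookup path is resolved through shared plug-in libraries at run time.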
            "A nation that is afraid to let its people judge the truth and falsehood in an open market is a nation that is afraid of its people.”
            – John F. Kennedy, February 26, 1962.



              #7
              Originally posted by Tahaan View Post
              OK, but that still leaves my original question unanswered.

              If there were an opt-in search feature, an optional extra, would you have any issue with that?

              P.S. I do not see these methods of contributing as being mutually exclusive.
              No, I would not, as long as the default was not opted in and there were no nag screens.
              The difference between the methods of contributing is subtle. Direct is a conscious contribution; the other, after the initial opt-in, requires no thought. But maybe I am nitpicking.

              @GreyGeek: You completely lost me with your response. I may be dense, but I do not see the connection to the OP's suggestion.
              Last edited by richb; Nov 21, 2013, 10:44 AM.
              Linux because it works. No social or political motives in my decision to use it.
              Always consider Occam's Razor
              Rich



                #8
                Originally posted by GreyGeek View Post
                Another thing I've noticed is the use of dynamically linked libraries. BECAUSE RAM and HD space are no longer a significant issue, I'd love to see applications become statically linked. For those without a background in programming, this means that the application contains within itself ALL of the libraries it calls, or at least the functions it uses from those libraries. A dynamically linked app might be only 500Kb to 5Mb in size. The same app, statically linked, might be 10 times larger. A 50Mb app? Skype is already 32Mb and it is dynamic. So what if it were 320Mb, if one didn't have to have the ia32 libs installed? A thousand such apps would take 320Gb of HD. So? Running statically linked apps would sure eliminate the "incompatible library" problem. Nothing was more aggravating than the developer linking against libraries on his system that weren't yet available to the user, and then the user not being able to find them, or to find the right version. Debian's package system eliminates that if one stays within the repository, but it wouldn't be a problem at all with statically linked apps. What about running more than one statically linked app at a time, say two or three apps that are 250Mb each? What happens now? We have time slicing and page swapping as programs are shifted in and out of memory when the user switches between two or more apps running simultaneously.

                But all this is a pipe dream. The fact is that in the shift from libc5 to libc6, support for static linking in the gcc compiler was severely hampered. Static linking won't work without the use of the Name Service Switch (controlled by /etc/nsswitch.conf), and NSS requires linking to shared libraries! Configuring glibc with --enable-static-nss won't work either, because you've got to link every static program that uses NSS routines with all the NSS and resolv libraries. So, static linking on Linux is highly problematic, if not entirely dead.
                Are you saying a unified search is a kitchen-sink approach? I would rather say it is a small, fast utility that aggregates the output from multiple sources. Think of it as a script that does: grep; grep; grep; join; join; nroff; pg.

                But I must say that the benefit of statically linked binaries lies in a different direction. Incompatibilities happen because developers broke things, and there are ways to work around that. For 99.999% of users that is the exception, because packaging, particularly the dependency definitions, sorts these issues out... for the most part.

                The issue most people will experience is MUCH longer load times. The kernel shares "Shared Object" files automatically by loading each one into RAM only once. The tiny program executables suddenly become HUGE when you statically link them: orders of magnitude bigger, not just two or three times bigger. Try it for yourself.

                Statically linked binaries do have benefits, though. They do not depend on a working dynamic-linker environment, and they typically do not depend on all your file system mount points being mounted. That is why they are often used in emergency situations, such as in repair tools.

                Believe me, when people find that their 10MB program is really a 500MB program to install, as is the case with desktop applications, you really appreciate why you do not want to statically link your executables.

                Code:
                johan@Komputer:~/Project/hello$ cat hw.c
                /* Hello World program */
                
                #include <stdio.h>
                
                int main(void)
                {
                    printf("Hello World\n");
                    return 0;
                }
                johan@Komputer:~/Project/hello$ gcc -o hw-dyn hw.c
                johan@Komputer:~/Project/hello$ gcc -static -o hw-static hw.c
                johan@Komputer:~/Project/hello$ ls -l
                total 892
                -rw-rw-r-- 1 johan johan    109 Nov 21 19:21 hw.c
                -rwxrwxr-x 1 johan johan   8518 Nov 21 19:22 hw-dyn
                -rwxrwxr-x 1 johan johan 895703 Nov 21 19:22 hw-static



                  #9
                  Yes... kitchen sink. Do a file search in Dolphin and use locate in a terminal for the same search. Locate is significantly faster because it doesn't have the GUI overhead and talks to the file system directly through libc6. With Dolphin your query navigates through several object layers before the search actually begins, and back through them to display the results.

                  Some statically linked programs might be "orders of magnitude" (100X, 1000X, 10000X, etc...), depending on which libraries were added, but many would not. I was training my replacement on a production app that created a 5MB EXE. She compiled it statically and created a 52MB EXE, and part of that included the database it worked on. It took forever to load and start running, and ran like a dog because of all the memory swapping that was required at the time.

                  BTW, you did read the part about static compiling with gcc being essentially dead, and the reasons why?

                  When I was running KNOPPIX there was a web site that offered a clever workaround for "statically compiled" programs. It involved downloading a plug-in and installing it in the file manager. Apps were created by making an ISO file of the development tree, it appeared to me, with all relevant libraries in subdirectories under the app's directory, which also contained the binary of the app. Clicking on the ISO called the plug-in, which did something similar to a loop mount. If the app was a word processor, the documents that the user created were stored in that subdirectory. When the app was closed the ISO would unmount, but it would include the docs the user had made. Very clever. I can't recall the name of the site or the system that was used.

                  EDIT: I couldn't think of that system's name, but after I posted the msg I recalled something called "klix" or "Klixs", or something like that, and I could not find it in a Google search. The web site was really cool. It looked sort of like Appers.
                  Last edited by GreyGeek; Nov 22, 2013, 02:20 PM.
                  "A nation that is afraid to let its people judge the truth and falsehood in an open market is a nation that is afraid of its people.”
                  – John F. Kennedy, February 26, 1962.



                    #10
                    This was an interesting read.

                    @ OP: I don't think I'd use a search function like the one you have described, but I wouldn't be opposed to it being in the repos (not installed by default).

                    @ GreyGeek: One thing I'd add about statically linked programs is that the storage requirement part of your argument makes good sense for HDDs, but many laptops now come with relatively small SSDs because of a) the cost and b) the push for smaller/thinner devices (e.g. Chromebook Pixel has a 32GB SSD).

                    So, keeping space requirements down is still sensible because it means you can get things done on amazingly thin devices. Next stop: large paperlike screens with a built-in OS that you can roll up. I think I remember reading about a physical magazine that had 3G connectivity built into it as a demo... can't find the link now, but that kind of thing may become more common and would be really useful.
                    samhobbs.co.uk



                      #11
                      Originally posted by GreyGeek View Post
                      Some statically linked programs might be "orders of magnitude" (100X, 1000X, 10000X, etc...), depending on which libraries were added, but many would not. I was training my replacement on a production app that created a 5MB EXE. She compiled it statically and created a 52MB EXE, and part of that included the database it worked on. It took forever to load and start running, and ran like a dog because of all the memory swapping that was required at the time.

                      BTW, you did read the part about static compiling with gcc being essentially dead, and the reasons why?
                      No, sorry, I don't know what "part about static compiling" you are referring to. Regardless, like I said: try it. Most system utilities are as small as they are purely because of dynamic linking.

                      Remember that many system utilities are used many times "in parallel", and static linking would mean:
                      a) all the shared code gets reloaded each time (no sharing),
                      b) separately for each program (no cache advantage from the previous program having used the same libc/libz/etc a moment ago),
                      c) taking up RAM over and over in each loaded application,
                      d) taking up bus bandwidth while being copied into RAM,
                      e) hogging disk access while being loaded,
                      f) hogging space in every single binary on disk,
                      g) and often hogging the disk cache when not in use.

                      libc is 1.3 MB. That is an EXTRA 1.3 MB for practically every single executable out there. I have more than 2000 dynamically linked ELF binary files just under /bin and /usr/bin, and while they don't all use every single function in libc, that is just one library.
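                      The back-of-the-envelope arithmetic behind that claim can be sketched in a few lines of C (the 2000-binary and 1.3 MB figures are taken from the post above; real numbers vary by system):

                      ```c
                      /* overhead.c: rough on-disk cost of duplicating libc's code into
                       * every binary. The figures are illustrative, taken from the
                       * discussion above, not measured on any particular system. */
                      #include <stdio.h>

                      int main(void)
                      {
                          const int binaries = 2000;   /* dynamically linked ELF files under /bin and /usr/bin */
                          const double libc_mb = 1.3;  /* approximate size of libc */
                          const double duplicated_mb = binaries * libc_mb;

                          printf("extra on-disk cost: %.1f MB (%.2f GB)\n",
                                 duplicated_mb, duplicated_mb / 1024.0);
                          return 0;
                      }
                      ```

                      That is the worst case, of course: not every binary would absorb all of libc, but even a fraction of it is a lot of duplication.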

                      Statically linking everything would be a ridiculous waste of resources.



                        #12
                        Originally posted by Feathers McGraw View Post
                        .....@ GreyGeek: One thing I'd add about statically linked programs is that the storage requirement part of your argument makes good sense for HDDs, but many laptops now come with relatively small SSDs because of a) the cost and b) the push for smaller/thinner devices (e.g. Chromebook Pixel has a 32GB SSD). ...
                        Good point. I suspect that as Moore's law continues to work, the size and speed of SSDs will continue to increase. Perhaps, in less than a decade, 500GB SSDs won't be uncommon and 1TB models will be in development.

                        I just read a science news report about the discovery of a potential superconductor that can operate at 100 C. It's based on tin with fluorine doping.
                        http://www.sciencedaily.com/releases...1121135635.htm
                        "A nation that is afraid to let its people judge the truth and falsehood in an open market is a nation that is afraid of its people.”
                        – John F. Kennedy, February 26, 1962.

